Problem Statement: The chest radiograph is the most commonly performed diagnostic imaging study. Because of the high volume of chest radiography, reviewing each image manually is time-consuming and labor-intensive for radiologists. An automated solution that locates the position of inflammation in an image is therefore desirable. Such an automated pneumonia screening system can help physicians make better clinical decisions.
Business Domain Value: Automating pneumonia screening in chest radiographs and localizing the affected area with bounding boxes can assist physicians in making better clinical decisions, or even replace human judgement in certain functional areas of healthcare (e.g., radiology). Guided by relevant clinical questions, powerful AI techniques can unlock clinically relevant information hidden in massive amounts of data, which in turn supports clinical decision making.
Details about the data and dataset files are available at the link below: https://www.kaggle.com/c/rsna-pneumonia-detection-challenge/data
Purpose of this project: Identify patients with pneumonia. Some patients show symptoms but it is unclear whether they are actually affected by the disease; this project also helps identify those symptomatic patients who might be affected.
# Mounting Google Colab drive
from google.colab import drive
drive.mount('/content/drive/', force_remount=True)
Mounted at /content/drive/
gpu_info = !nvidia-smi
gpu_info = '\n'.join(gpu_info)
if gpu_info.find('failed') >= 0:
    print('Not connected to a GPU')
else:
    print(gpu_info)
Fri Nov 19 16:46:04 2021
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 495.44 Driver Version: 460.32.03 CUDA Version: 11.2 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|===============================+======================+======================|
| 0 Tesla P100-PCIE... Off | 00000000:00:04.0 Off | 0 |
| N/A 36C P0 26W / 250W | 0MiB / 16280MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=============================================================================|
| No running processes found |
+-----------------------------------------------------------------------------+
# Memory capacity details
from psutil import virtual_memory
ram = virtual_memory().total / 1e9
print('Your runtime has {:.1f} gigabytes of available RAM\n'.format(ram))
if ram < 20:
    print('To enable a high-RAM runtime, select the Runtime > "Change runtime type"')
    print('menu, and then select High-RAM in the Runtime shape dropdown. Then, ')
    print('re-execute this cell.')
else:
    print('You are using a high-RAM runtime!')
Your runtime has 27.3 gigabytes of available RAM
You are using a high-RAM runtime!
import cv2
import pickle
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import pylab
!pip install pydicom
!pip install mrcnn
import pydicom as pyd
import seaborn as sns
from tqdm import tqdm
from sklearn.preprocessing import OneHotEncoder
from sklearn.metrics import confusion_matrix
from keras.models import Model, load_model
from keras.layers import Dense, Input, Conv2D, MaxPool2D, Flatten
from keras.preprocessing.image import ImageDataGenerator
from glob import glob
import os
from matplotlib.patches import Rectangle
from mrcnn.config import Config
from collections import defaultdict
import matplotlib.patches as patches
from matplotlib.patches import Rectangle
import tensorflow as tf
%matplotlib inline
from tensorflow.keras.applications.mobilenet import MobileNet
from tensorflow.keras.layers import Concatenate, UpSampling2D, Conv2D, Reshape, BatchNormalization
from tensorflow.keras.models import Model
from tensorflow.keras.losses import binary_crossentropy
from tensorflow.keras.optimizers import Adam
import skimage
from skimage.transform import resize
from skimage import feature, filters
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix
from sklearn.metrics import classification_report
import random
import pickle
from sklearn.metrics import roc_curve,auc,precision_recall_curve,classification_report
Collecting pydicom
Downloading pydicom-2.2.2-py3-none-any.whl (2.0 MB)
|████████████████████████████████| 2.0 MB 5.0 MB/s
Installing collected packages: pydicom
Successfully installed pydicom-2.2.2
Collecting mrcnn
Downloading mrcnn-0.2.tar.gz (51 kB)
|████████████████████████████████| 51 kB 230 kB/s
Building wheels for collected packages: mrcnn
Building wheel for mrcnn (setup.py) ... done
Created wheel for mrcnn: filename=mrcnn-0.2-py3-none-any.whl size=54930 sha256=b6be66592942ed50ead6f01269ce38a1924bde1d5dc207d6829ecf0399de9f2d
Stored in directory: /root/.cache/pip/wheels/1d/94/0d/03ff96abc43d2d6c8299a92cbb4eced2a1eda3ca7911c19427
Successfully built mrcnn
Installing collected packages: mrcnn
Successfully installed mrcnn-0.2
#Unzip the .zip file
#from zipfile import ZipFile
#with ZipFile('/content/drive/MyDrive/AI_ML_Projects/Capstone Project/rsna-pneumonia-detection-challenge.zip', 'r') as z:
#z.extractall()
# Setting path
root_path = '/content/drive/MyDrive/AI_ML_Projects/Capstone Project/'
os.chdir(root_path)
# Checking file format (.dcm) in the train images folder
for file in os.listdir(os.path.join(root_path, 'stage_2_train_images')):
    if not file.endswith('.dcm'):
        print(file)
#else:
#    print('All files inside Train Images folder are .dcm format')
# Checking file format (.dcm) in the test images folder
for file in os.listdir(os.path.join(root_path, 'stage_2_test_images')):
    if not file.endswith('.dcm'):
        print(file)
#else:
#    print('All files inside Test Images folder are .dcm format')
label_meta_data = pd.read_csv('stage_2_detailed_class_info.csv')
train_labels_df = pd.read_csv('stage_2_train_labels.csv')
print('Size of Dataset 1: ',train_labels_df.shape)
print('Size of Dataset 2: ',label_meta_data.shape)
print('Number of Unique X-Rays in Dataset 1 : ',train_labels_df['patientId'].nunique())
print('Number of Unique X-Rays in Dataset 2 : ',label_meta_data['patientId'].nunique())
Size of Dataset 1:  (30227, 6)
Size of Dataset 2:  (30227, 2)
Number of Unique X-Rays in Dataset 1 :  26684
Number of Unique X-Rays in Dataset 2 :  26684
train_labels_df.drop_duplicates(inplace=True)
label_meta_data.drop_duplicates(inplace=True)
print('Size of Dataset 1: ',train_labels_df.shape)
print('Size of Dataset 2: ',label_meta_data.shape)
print('Number of Unique X-Rays in Dataset 1 : ',train_labels_df['patientId'].nunique())
print('Number of Unique X-Rays in Dataset 2 : ',label_meta_data['patientId'].nunique())
Size of Dataset 1:  (30227, 6)
Size of Dataset 2:  (26684, 2)
Number of Unique X-Rays in Dataset 1 :  26684
Number of Unique X-Rays in Dataset 2 :  26684
label_meta_data.head(10)
| patientId | class | |
|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | No Lung Opacity / Not Normal |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | No Lung Opacity / Not Normal |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | No Lung Opacity / Not Normal |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | Normal |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | Lung Opacity |
| 6 | 00569f44-917d-4c86-a842-81832af98c30 | No Lung Opacity / Not Normal |
| 7 | 006cec2e-6ce2-4549-bffa-eadfcd1e9970 | No Lung Opacity / Not Normal |
| 8 | 00704310-78a8-4b38-8475-49f4573b2dbb | Lung Opacity |
| 10 | 008c19e8-a820-403a-930a-bc74a4053664 | No Lung Opacity / Not Normal |
| 11 | 009482dc-3db5-48d4-8580-5c89c4f01334 | Normal |
train_labels_df.head(10)
| patientId | x | y | width | height | Target | |
|---|---|---|---|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | NaN | NaN | NaN | NaN | 0 |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | NaN | NaN | NaN | NaN | 0 |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | NaN | NaN | NaN | NaN | 0 |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | NaN | NaN | NaN | NaN | 0 |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | 264.0 | 152.0 | 213.0 | 379.0 | 1 |
| 5 | 00436515-870c-4b36-a041-de91049b9ab4 | 562.0 | 152.0 | 256.0 | 453.0 | 1 |
| 6 | 00569f44-917d-4c86-a842-81832af98c30 | NaN | NaN | NaN | NaN | 0 |
| 7 | 006cec2e-6ce2-4549-bffa-eadfcd1e9970 | NaN | NaN | NaN | NaN | 0 |
| 8 | 00704310-78a8-4b38-8475-49f4573b2dbb | 323.0 | 577.0 | 160.0 | 104.0 | 1 |
| 9 | 00704310-78a8-4b38-8475-49f4573b2dbb | 695.0 | 575.0 | 162.0 | 137.0 | 1 |
Compare the labels and the class information for a possible join
print("Shape of the train labels:", train_labels_df.shape)
print("Shape of the detailed class information:", label_meta_data.shape)
Shape of the train labels: (30227, 6)
Shape of the detailed class information: (26684, 2)
Data Inference: A join or merge should give us a dataset of 30,227 rows (one per bounding-box record), assuming we keep all label rows; merging on 'patientId' avoids carrying a redundant duplicate of that column.
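A minimal sketch with toy data (single-letter IDs standing in for the real UUID patientIds) illustrating why a left merge of the class information onto the labels should yield one row per bounding-box record:

```python
import pandas as pd

# Toy stand-ins for the two datasets
class_info = pd.DataFrame({
    'patientId': ['a', 'b', 'c'],
    'class': ['Normal', 'Lung Opacity', 'Lung Opacity'],
})
labels = pd.DataFrame({
    'patientId': ['a', 'b', 'b', 'c'],   # 'b' has two bounding boxes
    'Target':    [0, 1, 1, 1],
})

# Merging on patientId keeps one row per bounding-box record,
# so the result has as many rows as the larger labels table.
merged = class_info.merge(labels, on='patientId', how='left')
print(merged.shape)  # (4, 3): four label rows; patientId, class, Target
```

The same logic scales to the real data: 26,684 class rows merged against 30,227 label rows give a 30,227-row result.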
Check uniqueness of the data
Approach:
There could be duplicate patientId entries, resulting in multiple bounding boxes with corresponding target/class information. Compare whether the sequence of records is synchronous between the "train labels" and "class information" datasets; if synchronous, a simple join can be performed on the index.
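As a minimal sketch (again with toy single-letter IDs in place of the real UUIDs), positional synchrony can be checked by comparing the two patientId columns directly:

```python
import pandas as pd

# Toy stand-ins: both files list the same patients in the same order
labels = pd.DataFrame({'patientId': ['a', 'b', 'b', 'c']})
class_info = pd.DataFrame({'patientId': ['a', 'b', 'b', 'c']})

# If the columns match positionally (same order, same multiplicity),
# an index-based join such as pd.concat(..., axis=1) would be safe.
aligned = labels['patientId'].equals(class_info['patientId'])
print(aligned)  # True for this toy example
```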
Exploring Train Labels Dataset
grouped_data_label=pd.DataFrame()
grouped_data_label = train_labels_df['patientId'].value_counts().value_counts().reset_index()
grouped_data_label.columns = ['Counts', 'records in train labels']
grouped_data_label.style.hide_index()
| Counts | records in train labels |
|---|---|
| 1 | 23286 |
| 2 | 3266 |
| 3 | 119 |
| 4 | 13 |
total_unique_data_labels = grouped_data_label['records in train labels'].sum()
print("Total unique records in train labels = ", total_unique_data_labels)
Total unique records in train labels = 26684
Inference:
Most patients (23,286) have a single record; 3,266 have two bounding boxes, 119 have three, and 13 have four, for 26,684 unique patients in total.
Exploring Class Information Dataset
grouped_data_class=pd.DataFrame()
grouped_data_class = label_meta_data['patientId'].value_counts().value_counts().reset_index()
grouped_data_class.columns = ['counts', 'records in class info']
grouped_data_class.style.hide_index()
| counts | records in class info |
|---|---|
| 1 | 26684 |
total_unique_data_class= grouped_data_class['records in class info'].sum()
print("Total unique records in class info = ", total_unique_data_class)
Total unique records in class info = 26684
Merging Data
merged_df = pd.merge(left = label_meta_data, right = train_labels_df, how = 'left', on = 'patientId')
#del label_meta_data, train_labels_df
merged_df.info(null_counts = True)
merged_df.head(20)
<class 'pandas.core.frame.DataFrame'>
Int64Index: 30227 entries, 0 to 30226
Data columns (total 7 columns):
 #   Column     Non-Null Count  Dtype  
---  ------     --------------  -----  
 0   patientId  30227 non-null  object 
 1   class      30227 non-null  object 
 2   x          9555 non-null   float64
 3   y          9555 non-null   float64
 4   width      9555 non-null   float64
 5   height     9555 non-null   float64
 6   Target     30227 non-null  int64  
dtypes: float64(4), int64(1), object(2)
memory usage: 1.8+ MB
| patientId | class | x | y | width | height | Target | |
|---|---|---|---|---|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | Normal | NaN | NaN | NaN | NaN | 0 |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | Lung Opacity | 264.0 | 152.0 | 213.0 | 379.0 | 1 |
| 5 | 00436515-870c-4b36-a041-de91049b9ab4 | Lung Opacity | 562.0 | 152.0 | 256.0 | 453.0 | 1 |
| 6 | 00569f44-917d-4c86-a842-81832af98c30 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 |
| 7 | 006cec2e-6ce2-4549-bffa-eadfcd1e9970 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 |
| 8 | 00704310-78a8-4b38-8475-49f4573b2dbb | Lung Opacity | 323.0 | 577.0 | 160.0 | 104.0 | 1 |
| 9 | 00704310-78a8-4b38-8475-49f4573b2dbb | Lung Opacity | 695.0 | 575.0 | 162.0 | 137.0 | 1 |
| 10 | 008c19e8-a820-403a-930a-bc74a4053664 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 |
| 11 | 009482dc-3db5-48d4-8580-5c89c4f01334 | Normal | NaN | NaN | NaN | NaN | 0 |
| 12 | 009eb222-eabc-4150-8121-d5a6d06b8ebf | Normal | NaN | NaN | NaN | NaN | 0 |
| 13 | 00a85be6-6eb0-421d-8acf-ff2dc0007e8a | Normal | NaN | NaN | NaN | NaN | 0 |
| 14 | 00aecb01-a116-45a2-956c-08d2fa55433f | Lung Opacity | 288.0 | 322.0 | 94.0 | 135.0 | 1 |
| 15 | 00aecb01-a116-45a2-956c-08d2fa55433f | Lung Opacity | 547.0 | 299.0 | 119.0 | 165.0 | 1 |
| 16 | 00c0b293-48e7-4e16-ac76-9269ba535a62 | Lung Opacity | 306.0 | 544.0 | 168.0 | 244.0 | 1 |
| 17 | 00c0b293-48e7-4e16-ac76-9269ba535a62 | Lung Opacity | 650.0 | 511.0 | 206.0 | 284.0 | 1 |
| 18 | 00d7c36e-3cdf-4df6-ac03-6c30cdc8e85b | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 |
| 19 | 00f08de1-517e-4652-a04f-d1dc9ee48593 | Lung Opacity | 181.0 | 184.0 | 206.0 | 506.0 | 1 |
Shape of Dataset
merged_df['patientId'].nunique()
26684
merged_df['patientId'].value_counts()
0ab261f9-4eb5-42ab-a9a5-e918904d6356 4
76f71a93-8105-4c79-a010-0cfa86f0061a 4
31764d54-ea3b-434f-bae2-8c579ed13799 4
3239951b-6211-4290-b237-3d9ad17176db 4
1bf08f3b-a273-4f51-bafa-b55ada2c23b5 4
..
d60cdcd8-66ee-4e89-bbcc-cdaed2895d84 1
8332c0a4-5784-4f5d-901c-3c6350c9a732 1
ba652b03-e12d-4c42-bc2f-8a6865ecd164 1
6ebb39ce-4cd8-4c51-bf26-757a532958b4 1
ce6385e6-88c6-432a-b32f-c7018795bc32 1
Name: patientId, Length: 26684, dtype: int64
label_count=merged_df['class'].value_counts()
explode = (0.01,0.01,0.01)
fig1, ax1 = plt.subplots(figsize=(5,5))
ax1.pie(label_count.values, explode=explode, labels=label_count.index, autopct='%1.1f%%',
shadow=True, startangle=90)
ax1.axis('equal')
plt.title('Class Distribution')
plt.show()
print('Total records with Lung Opacity: ', merged_df[merged_df['class'] == 'Lung Opacity']['class'].count())
print('Total records with No Lung Opacity / Not Normal: ', merged_df[merged_df['class'] == 'No Lung Opacity / Not Normal']['class'].count())
print('Total Normal records: ', merged_df[merged_df['class'] == 'Normal']['class'].count())
Total records with Lung Opacity:  9555
Total records with No Lung Opacity / Not Normal:  11821
Total Normal records:  8851
Observation:
The graph above shows the number of records in each class. Patients labeled No Lung Opacity / Not Normal form the largest group, compared with the Lung Opacity and Normal classes.
8,851 (29.3%) records have no disease (Normal)
9,555 (31.6%) records have Lung Opacity
11,821 (39.1%) records have No Lung Opacity / Not Normal
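The percentages above can also be obtained directly with `value_counts(normalize=True)`; a small sketch reproducing the class shares from the counts in this section:

```python
import pandas as pd

# Rebuild the class column from the counts reported above
classes = pd.Series(
    ['Lung Opacity'] * 9555
    + ['No Lung Opacity / Not Normal'] * 11821
    + ['Normal'] * 8851
)

# normalize=True returns fractions instead of raw counts
shares = classes.value_counts(normalize=True).round(3)
print(shares)  # No Lung Opacity / Not Normal 0.391, Lung Opacity 0.316, Normal 0.293
```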
# lets take a look at our Target Distribution
label_count=merged_df['Target'].value_counts()
explode = (0.1,0.0)
fig1, ax1 = plt.subplots(figsize=(5,5))
ax1.pie(label_count.values, explode=explode, labels=['Normal','Pneumonia'], autopct='%1.1f%%',
shadow=True, startangle=90)
ax1.axis('equal')
plt.title('Target Distribution')
plt.show()
Observation:
From the above graph, 31.6% of records fall into the Pneumonia category and 68.4% into the Non-Pneumonia category.
# Read the DICOM header of each X-ray and collect patient metadata
age_list, sex_list, vp_list, studyid_list = [], [], [], []
for value in tqdm(merged_df['patientId']):
    dcm_patientFile = root_path + 'stage_2_train_images/%s.dcm' % value
    dcm_patientData = pyd.dcmread(dcm_patientFile)
    age_list.append(dcm_patientData.PatientAge)
    sex_list.append(dcm_patientData.PatientSex)
    vp_list.append(dcm_patientData.ViewPosition)
    studyid_list.append(dcm_patientData.StudyID)
merged_df['patientAge'] = age_list
merged_df['patientSex'] = sex_list
merged_df['ViewPosition'] = vp_list
merged_df['StudyID'] = studyid_list
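Since many patientIds repeat (one row per bounding box), a more efficient variant would read each unique DICOM header once and map the fields back onto the DataFrame. A sketch with a hypothetical `read_metadata` stub standing in for `pydicom.dcmread` on the real files:

```python
import pandas as pd

# Hypothetical stand-in for pydicom.dcmread: in the real notebook this
# would open root_path + 'stage_2_train_images/%s.dcm' % pid
def read_metadata(pid):
    return {'PatientAge': '45', 'PatientSex': 'M',
            'ViewPosition': 'PA', 'StudyID': ''}

df = pd.DataFrame({'patientId': ['a', 'b', 'b', 'c']})  # 'b' duplicated

# Read each unique DICOM once, then broadcast the values with map();
# this avoids re-opening the same file for every bounding-box row.
meta = {pid: read_metadata(pid) for pid in df['patientId'].unique()}
for field in ['PatientAge', 'PatientSex', 'ViewPosition', 'StudyID']:
    df[field] = df['patientId'].map(lambda p, f=field: meta[p][f])
print(len(meta))  # 3 unique reads for 4 rows
```

On the full dataset this cuts the number of file reads from 30,227 to 26,684 and avoids redundant I/O.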
merged_df.head(10)
28179
28180
28181
28182
28183
28184
28185
28186
28187
28188
28189
28190
28191
28192
28193
28194
28195
28196
28197
28198
28199
28200
28201
28202
28203
28204
28205
28206
28207
28208
28209
28210
28211
28212
28213
28214
28215
28216
28217
28218
28219
28220
28221
28222
28223
28224
28225
28226
28227
28228
28229
28230
28231
28232
28233
28234
28235
28236
28237
28238
28239
28240
28241
28242
28243
28244
28245
28246
28247
28248
28249
28250
28251
28252
28253
28254
28255
28256
28257
28258
28259
28260
28261
28262
28263
28264
28265
28266
28267
28268
28269
28270
28271
28272
28273
28274
28275
28276
28277
28278
28279
28280
28281
28282
28283
28284
28285
28286
28287
28288
28289
28290
28291
28292
28293
28294
28295
28296
28297
28298
28299
28300
28301
28302
28303
28304
28305
28306
28307
28308
28309
28310
28311
28312
28313
28314
28315
28316
28317
28318
28319
28320
28321
28322
28323
28324
28325
28326
28327
28328
28329
28330
28331
28332
28333
28334
28335
28336
28337
28338
28339
28340
28341
28342
28343
28344
28345
28346
28347
28348
28349
28350
28351
28352
28353
28354
28355
28356
28357
28358
28359
28360
28361
28362
28363
28364
28365
28366
28367
28368
28369
28370
28371
28372
28373
28374
28375
28376
28377
28378
28379
28380
28381
28382
28383
28384
28385
28386
28387
28388
28389
28390
28391
28392
28393
28394
28395
28396
28397
28398
28399
28400
28401
28402
28403
28404
28405
28406
28407
28408
28409
28410
28411
28412
28413
28414
28415
28416
28417
28418
28419
28420
28421
28422
28423
28424
28425
28426
28427
28428
28429
28430
28431
28432
28433
28434
28435
28436
28437
28438
28439
28440
28441
28442
28443
28444
28445
28446
28447
28448
28449
28450
28451
28452
28453
28454
28455
28456
28457
28458
28459
28460
28461
28462
28463
28464
28465
28466
28467
28468
28469
28470
28471
28472
28473
28474
28475
28476
28477
28478
28479
28480
28481
28482
28483
28484
28485
28486
28487
28488
28489
28490
28491
28492
28493
28494
28495
28496
28497
28498
28499
28500
28501
28502
28503
28504
28505
28506
28507
28508
28509
28510
28511
28512
28513
28514
28515
28516
28517
28518
28519
28520
28521
28522
28523
28524
28525
28526
28527
28528
28529
28530
28531
28532
28533
28534
28535
28536
28537
28538
28539
28540
28541
28542
28543
28544
28545
28546
28547
28548
28549
28550
28551
28552
28553
28554
28555
28556
28557
28558
28559
28560
28561
28562
28563
28564
28565
28566
28567
28568
28569
28570
28571
28572
28573
28574
28575
28576
28577
28578
28579
28580
28581
28582
28583
28584
28585
28586
28587
28588
28589
28590
28591
28592
28593
28594
28595
28596
28597
28598
28599
28600
28601
28602
28603
28604
28605
28606
28607
28608
28609
28610
28611
28612
28613
28614
28615
28616
28617
28618
28619
28620
28621
28622
28623
28624
28625
28626
28627
28628
28629
28630
28631
28632
28633
28634
28635
28636
28637
28638
28639
28640
28641
28642
28643
28644
28645
28646
28647
28648
28649
28650
28651
28652
28653
28654
28655
28656
28657
28658
28659
28660
28661
28662
28663
28664
28665
28666
28667
28668
28669
28670
28671
28672
28673
28674
28675
28676
28677
28678
28679
28680
28681
28682
28683
28684
28685
28686
28687
28688
28689
28690
28691
28692
28693
28694
28695
28696
28697
28698
28699
28700
28701
28702
28703
28704
28705
28706
28707
28708
28709
28710
28711
28712
28713
28714
28715
28716
28717
28718
28719
28720
28721
28722
28723
28724
28725
28726
28727
28728
28729
28730
28731
28732
28733
28734
28735
28736
28737
28738
28739
28740
28741
28742
28743
28744
28745
28746
28747
28748
28749
28750
28751
28752
28753
28754
28755
28756
28757
28758
28759
28760
28761
28762
28763
28764
28765
28766
28767
28768
28769
28770
28771
28772
28773
28774
28775
28776
28777
28778
28779
28780
28781
28782
28783
28784
28785
28786
28787
28788
28789
28790
28791
28792
28793
28794
28795
28796
28797
28798
28799
28800
28801
28802
28803
28804
28805
28806
28807
28808
28809
28810
28811
28812
28813
28814
28815
28816
28817
28818
28819
28820
28821
28822
28823
28824
28825
28826
28827
28828
28829
28830
28831
28832
28833
28834
28835
28836
28837
28838
28839
28840
28841
28842
28843
28844
28845
28846
28847
28848
28849
28850
28851
28852
28853
28854
28855
28856
28857
28858
28859
28860
28861
28862
28863
28864
28865
28866
28867
28868
28869
28870
28871
28872
28873
28874
28875
28876
28877
28878
28879
28880
28881
28882
28883
28884
28885
28886
28887
28888
28889
28890
28891
28892
28893
28894
28895
28896
28897
28898
28899
28900
28901
28902
28903
28904
28905
28906
28907
28908
28909
28910
28911
28912
28913
28914
28915
28916
28917
28918
28919
28920
28921
28922
28923
28924
28925
28926
28927
28928
28929
28930
28931
28932
28933
28934
28935
28936
28937
28938
28939
28940
28941
28942
28943
28944
28945
28946
28947
28948
28949
28950
28951
28952
28953
28954
28955
28956
28957
28958
28959
28960
28961
28962
28963
28964
28965
28966
28967
28968
28969
28970
28971
28972
28973
28974
28975
28976
28977
28978
28979
28980
28981
28982
28983
28984
28985
28986
28987
28988
28989
28990
28991
28992
28993
28994
28995
28996
28997
28998
28999
29000
29001
29002
29003
29004
29005
29006
29007
29008
29009
29010
29011
29012
29013
29014
29015
29016
29017
29018
29019
29020
29021
29022
29023
29024
29025
29026
29027
29028
29029
29030
29031
29032
29033
29034
29035
29036
29037
29038
29039
29040
29041
29042
29043
29044
29045
29046
29047
29048
29049
29050
29051
29052
29053
29054
29055
29056
29057
29058
29059
29060
29061
29062
29063
29064
29065
29066
29067
29068
29069
29070
29071
29072
29073
29074
29075
29076
29077
29078
29079
29080
29081
29082
29083
29084
29085
29086
29087
29088
29089
29090
29091
29092
29093
29094
29095
29096
29097
29098
29099
29100
29101
29102
29103
29104
29105
29106
29107
29108
29109
29110
29111
29112
29113
29114
29115
29116
29117
29118
29119
29120
29121
29122
29123
29124
29125
29126
29127
29128
29129
29130
29131
29132
29133
29134
29135
29136
29137
29138
29139
29140
29141
29142
29143
29144
29145
29146
29147
29148
29149
29150
29151
29152
29153
29154
29155
29156
29157
29158
29159
29160
29161
29162
29163
29164
29165
29166
29167
29168
29169
29170
29171
29172
29173
29174
29175
29176
29177
29178
29179
29180
29181
29182
29183
29184
29185
29186
29187
29188
29189
29190
29191
29192
29193
29194
29195
29196
29197
29198
29199
29200
29201
29202
29203
29204
29205
29206
29207
29208
29209
29210
29211
29212
29213
29214
29215
29216
29217
29218
29219
29220
29221
29222
29223
29224
29225
29226
29227
29228
29229
29230
29231
29232
29233
29234
29235
29236
29237
29238
29239
29240
29241
29242
29243
29244
29245
29246
29247
29248
29249
29250
29251
29252
29253
29254
29255
29256
29257
29258
29259
29260
29261
29262
29263
29264
29265
29266
29267
29268
29269
29270
29271
29272
29273
29274
29275
29276
29277
29278
29279
29280
29281
29282
29283
29284
29285
29286
29287
29288
29289
29290
29291
29292
29293
29294
29295
29296
29297
29298
29299
29300
29301
29302
29303
29304
29305
29306
29307
29308
29309
29310
29311
29312
29313
29314
29315
29316
29317
29318
29319
29320
29321
29322
29323
29324
29325
29326
29327
29328
29329
29330
29331
29332
29333
29334
29335
29336
29337
29338
29339
29340
29341
29342
29343
29344
29345
29346
29347
29348
29349
29350
29351
29352
29353
29354
29355
29356
29357
29358
29359
29360
29361
29362
29363
29364
29365
29366
29367
29368
29369
29370
29371
29372
29373
29374
29375
29376
29377
29378
29379
29380
29381
29382
29383
29384
29385
29386
29387
29388
29389
29390
29391
29392
29393
29394
29395
29396
29397
29398
29399
29400
29401
29402
29403
29404
29405
29406
29407
29408
29409
29410
29411
29412
29413
29414
29415
29416
29417
29418
29419
29420
29421
29422
29423
29424
29425
29426
29427
29428
29429
29430
29431
29432
29433
29434
29435
29436
29437
29438
29439
29440
29441
29442
29443
29444
29445
29446
29447
29448
29449
29450
29451
29452
29453
29454
29455
29456
29457
29458
29459
29460
29461
29462
29463
29464
29465
29466
29467
29468
29469
29470
29471
29472
29473
29474
29475
29476
29477
29478
29479
29480
29481
29482
29483
29484
29485
29486
29487
29488
29489
29490
29491
29492
29493
29494
29495
29496
29497
29498
29499
29500
29501
29502
29503
29504
29505
29506
29507
29508
29509
29510
29511
29512
29513
29514
29515
29516
29517
29518
29519
29520
29521
29522
29523
29524
29525
29526
29527
29528
29529
29530
29531
29532
29533
29534
29535
29536
29537
29538
29539
29540
29541
29542
29543
29544
29545
29546
29547
29548
29549
29550
29551
29552
29553
29554
29555
29556
29557
29558
29559
29560
29561
29562
29563
29564
29565
29566
29567
29568
29569
29570
29571
29572
29573
29574
29575
29576
29577
29578
29579
29580
29581
29582
29583
29584
29585
29586
29587
29588
29589
29590
29591
29592
29593
29594
29595
29596
29597
29598
29599
29600
29601
29602
29603
29604
29605
29606
29607
29608
29609
29610
29611
29612
29613
29614
29615
29616
29617
29618
29619
29620
29621
29622
29623
29624
29625
29626
29627
29628
29629
29630
29631
29632
29633
29634
29635
29636
29637
29638
29639
29640
29641
29642
29643
29644
29645
29646
29647
29648
29649
29650
29651
29652
29653
29654
29655
29656
29657
29658
29659
29660
29661
29662
29663
29664
29665
29666
29667
29668
29669
29670
29671
29672
29673
29674
29675
29676
29677
29678
29679
29680
29681
29682
29683
29684
29685
29686
29687
29688
29689
29690
29691
29692
29693
29694
29695
29696
29697
29698
29699
29700
29701
29702
29703
29704
29705
29706
29707
29708
29709
29710
29711
29712
29713
29714
29715
29716
29717
29718
29719
29720
29721
29722
29723
29724
29725
29726
29727
29728
29729
29730
29731
29732
29733
29734
29735
29736
29737
29738
29739
29740
29741
29742
29743
29744
29745
29746
29747
29748
29749
29750
29751
29752
29753
29754
29755
29756
29757
29758
29759
29760
29761
29762
29763
29764
29765
29766
29767
29768
29769
29770
29771
29772
29773
29774
29775
29776
29777
29778
29779
29780
29781
29782
29783
29784
29785
29786
29787
29788
29789
29790
29791
29792
29793
29794
29795
29796
29797
29798
29799
29800
29801
29802
29803
29804
29805
29806
29807
29808
29809
29810
29811
29812
29813
29814
29815
29816
29817
29818
29819
29820
29821
29822
29823
29824
29825
29826
29827
29828
29829
29830
29831
29832
29833
29834
29835
29836
29837
29838
29839
29840
29841
29842
29843
29844
29845
29846
29847
29848
29849
29850
29851
29852
29853
29854
29855
29856
29857
29858
29859
29860
29861
29862
29863
29864
29865
29866
29867
29868
29869
29870
29871
29872
29873
29874
29875
29876
29877
29878
29879
29880
29881
29882
29883
29884
29885
29886
29887
29888
29889
29890
29891
29892
29893
29894
29895
29896
29897
29898
29899
29900
29901
29902
29903
29904
29905
29906
29907
29908
29909
29910
29911
29912
29913
29914
29915
29916
29917
29918
29919
29920
29921
29922
29923
29924
29925
29926
29927
29928
29929
29930
29931
29932
29933
29934
29935
29936
29937
29938
29939
29940
29941
29942
29943
29944
29945
29946
29947
29948
29949
29950
29951
29952
29953
29954
29955
29956
29957
29958
29959
29960
29961
29962
29963
29964
29965
29966
29967
29968
29969
29970
29971
29972
29973
29974
29975
29976
29977
29978
29979
29980
29981
29982
29983
29984
29985
29986
29987
29988
29989
29990
29991
29992
29993
29994
29995
29996
29997
29998
29999
30000
30001
30002
30003
30004
30005
30006
30007
30008
30009
30010
30011
30012
30013
30014
30015
30016
30017
30018
30019
30020
30021
30022
30023
30024
30025
30026
30027
30028
30029
30030
30031
30032
30033
30034
30035
30036
30037
30038
30039
30040
30041
30042
30043
30044
30045
30046
30047
30048
30049
30050
30051
30052
30053
30054
30055
30056
30057
30058
30059
30060
30061
30062
30063
30064
30065
30066
30067
30068
30069
30070
30071
30072
30073
30074
30075
30076
30077
30078
30079
30080
30081
30082
30083
30084
30085
30086
30087
30088
30089
30090
30091
30092
30093
30094
30095
30096
30097
30098
30099
30100
30101
30102
30103
30104
30105
30106
30107
30108
30109
30110
30111
30112
30113
30114
30115
30116
30117
30118
30119
30120
30121
30122
30123
30124
30125
30126
30127
30128
30129
30130
30131
30132
30133
30134
30135
30136
30137
30138
30139
30140
30141
30142
30143
30144
30145
30146
30147
30148
30149
30150
30151
30152
30153
30154
30155
30156
30157
30158
30159
30160
30161
30162
30163
30164
30165
30166
30167
30168
30169
30170
30171
30172
30173
30174
30175
30176
30177
30178
30179
30180
30181
30182
30183
30184
30185
30186
30187
30188
30189
30190
30191
30192
30193
30194
30195
30196
30197
30198
30199
30200
30201
30202
30203
30204
30205
30206
30207
30208
30209
30210
30211
30212
30213
30214
30215
30216
30217
30218
30219
30220
30221
30222
30223
30224
30225
30226
30227
| | patientId | class | x | y | width | height | Target | patientAge | patientSex | ViewPosition | StudyID |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 51 | F | PA | |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 48 | F | PA | |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 19 | M | AP | |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | Normal | NaN | NaN | NaN | NaN | 0 | 28 | M | PA | |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | Lung Opacity | 264.0 | 152.0 | 213.0 | 379.0 | 1 | 32 | F | AP | |
| 5 | 00436515-870c-4b36-a041-de91049b9ab4 | Lung Opacity | 562.0 | 152.0 | 256.0 | 453.0 | 1 | 32 | F | AP | |
| 6 | 00569f44-917d-4c86-a842-81832af98c30 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 54 | M | AP | |
| 7 | 006cec2e-6ce2-4549-bffa-eadfcd1e9970 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 78 | M | PA | |
| 8 | 00704310-78a8-4b38-8475-49f4573b2dbb | Lung Opacity | 323.0 | 577.0 | 160.0 | 104.0 | 1 | 75 | M | PA | |
| 9 | 00704310-78a8-4b38-8475-49f4573b2dbb | Lung Opacity | 695.0 | 575.0 | 162.0 | 137.0 | 1 | 75 | M | PA |
fig, axes = plt.subplots(1, 1, figsize=(7, 7))
pneumonia_on_age = sns.histplot(merged_df[merged_df['Target']==1]['patientAge'], color='green', label='Target 1')
pneumonia_count = pneumonia_on_age.set_ylabel('Count')
pneumonia_count = pneumonia_on_age.set_title('Patient Age vs pneumonia case count')
Observation:
Pneumonia case counts peak for patients roughly between 50 and 65 years of age.
Normal Data Distribution
n = sns.histplot(merged_df[merged_df['class']=='Normal']['patientAge'], color='blue')
normal = n.set_ylabel('Count - Normal')
normal = n.set_xlabel('PatientAge - Normal')
The graph above shows that 'Normal' cases are most frequent among patients aged 40-60.
No Lung Opacity / Not Normal Data Distribution
not_normal = sns.histplot(merged_df[merged_df['class']=='No Lung Opacity / Not Normal']['patientAge'], color='green')
not_normal_data = not_normal.set_ylabel('Count')
not_normal_data = not_normal.set_xlabel('PatientAge - No Lung Opacity / Not Normal')
Observation: The graph above shows that patients aged 40-60 account for the most cases.
Lung Opacity Data Distribution
lung_opacity = sns.histplot(merged_df[merged_df['class']=='Lung Opacity']['patientAge'], color='red')
lung_opacity_data = lung_opacity.set_ylabel('Count')
lung_opacity_data = lung_opacity.set_xlabel('patientAge - Lung Opacity')
Observation:
The graph above clearly shows that most 'Lung Opacity' patients fall in the 40-60 age range.
fig, axes = plt.subplots(1, 1, figsize=(7, 7))
gender = sns.countplot(x='Target', hue='patientSex', data=merged_df)
gender_data = gender.set_title('Patient Gender')
fig, axes = plt.subplots(1, 1, figsize=(7, 7))
gender_class = sns.countplot(x='patientSex', hue='class', data=merged_df)
gender_class_data = gender_class.set_title('Gender - Class')
Observation:
Male patients have more records than female patients across all classes.
fig, axes = plt.subplots(1, 1, figsize=(20, 7))
gender_comp = sns.countplot(x='patientAge', hue='patientSex', data=merged_df, order=merged_df['patientAge'].value_counts().index)
gender_data = gender_comp.set_title('Patient Gender and Age')
target_data = merged_df[merged_df['Target']==1]
print(target_data.shape)
target_sample = target_data.sample(9555)
target_sample['xc'] = target_sample['x'] + target_sample['width'] / 2
target_sample['yc'] = target_sample['y'] + target_sample['height'] / 2
(9555, 11)
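The `xc`/`yc` columns above are just the geometric centers of each bounding box (`x + width/2`, `y + height/2`). A quick standalone sanity check of that arithmetic, separate from the notebook's pipeline:

```python
def box_center(x, y, width, height):
    """Return the (xc, yc) center of an axis-aligned box given its
    top-left corner and size, matching the xc/yc columns above."""
    return (x + width / 2, y + height / 2)

# First 'Lung Opacity' row from the merged table: x=264, y=152, w=213, h=379
print(box_center(264.0, 152.0, 213.0, 379.0))  # (370.5, 341.5)
```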
target_sample['patientAge'] = target_sample['patientAge'].astype(int)
target_age1 = target_sample[target_sample['patientAge'] < 20]
target_age2 = target_sample[(target_sample['patientAge'] >=20) & (target_sample['patientAge'] < 35)]
target_age3 = target_sample[(target_sample['patientAge'] >=35) & (target_sample['patientAge'] < 50)]
target_age4 = target_sample[(target_sample['patientAge'] >=50) & (target_sample['patientAge'] < 65)]
target_age5 = target_sample[target_sample['patientAge'] >= 65]
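The five chained boolean masks above implement a fixed age binning. The same bins can be expressed more compactly with the standard library's `bisect`; a sketch using the same edges (20, 35, 50, 65):

```python
from bisect import bisect_right

AGE_EDGES = [20, 35, 50, 65]
AGE_LABELS = ['<20', '20-34', '35-49', '50-64', '65+']

def age_group(age):
    """Map an integer age to the same five bins used for target_age1..target_age5."""
    return AGE_LABELS[bisect_right(AGE_EDGES, age)]

print([age_group(a) for a in (19, 20, 49, 50, 64, 65, 80)])
# ['<20', '20-34', '35-49', '50-64', '50-64', '65+', '65+']
```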
def plot_data(data, color_point, color_window, text):
    fig, ax = plt.subplots(1, 1, figsize=(7, 7))
    plt.title("Centers of Lung Opacity bounding boxes, with boxes overlaid\n{}".format(text))
    data.plot.scatter(x='xc', y='yc', xlim=(0, 1024), ylim=(0, 1024), ax=ax, alpha=0.8, marker=".", color=color_point)
    for i, crt_sample in data.iterrows():
        ax.add_patch(Rectangle(xy=(crt_sample['x'], crt_sample['y']),
                               width=crt_sample['width'], height=crt_sample['height'], alpha=3.5e-3, color=color_window))
    plt.show()
plot_data(target_age1, 'blue', 'orange', 'Patient Age: <20 years')
plot_data(target_age2, 'blue', 'orange', 'Patient Age: 20-34 years')
plot_data(target_age3, 'blue', 'orange', 'Patient Age: 35-49 years')
plot_data(target_age4, 'blue', 'orange', 'Patient Age: 50-64 years')
plot_data(target_age5, 'blue', 'orange', 'Patient Age: 65+ years')
Observation:
The graphs above clearly show that patients aged 50-64 have the most pneumonia cases.
Class
fig, axes = plt.subplots(1, 1, figsize=(7, 7))
view_position_class = sns.countplot(x='ViewPosition', hue='class', data=merged_df)
view_position_class_data = view_position_class.set_title('View Position vs Class')
Observation:
Patients with the AP view position have the most records.
Target
fig, axes = plt.subplots(1, 1, figsize=(7, 7))
view_position_target = sns.countplot(x='Target', hue='ViewPosition', data=merged_df)
view_position_target_data = view_position_target.set_title('View Position Vs Target')
Observation:
Patients with the AP view position have more records than those with PA.
Gender Wise
fig, axes = plt.subplots(1, 1, figsize=(7, 7))
view_position_gender = sns.countplot(x='ViewPosition', hue='patientSex', data=merged_df)
view_position_gender_data = view_position_gender.set_title('View Position Vs Patient Sex')
Observation:
AP view-position patients again have more records than PA patients.
Age wise
fig, axes = plt.subplots(1, 1, figsize=(7, 7))
age_data = sns.histplot(merged_df[merged_df['ViewPosition']=='AP']['patientAge'], color='red', label='AP')
age_data = sns.histplot(merged_df[merged_df['ViewPosition']=='PA']['patientAge'], color='green', label='PA')
view_position_age_count_ = age_data.set_ylabel('Count')
view_position_age_count_ = age_data.legend()
view_position_age_count_ = age_data.set_title('View Position vs Age')
Observation:
The plot clearly shows that the AP view position has more records than PA across all ages.
# Function to read a DICOM image for a given patient
def read_image(patientId):
    train_fp = root_path + '/stage_2_train_images/%s.dcm' % patientId
    dcm = pydicom.dcmread(train_fp)  # read_file is deprecated in newer pydicom
    return dcm
def image_grid(df, pid_sample_list, nrows=3, ncols=3, draw_bbox=True, ax_off=True):
    fig = plt.figure(figsize=(16, 12))
    for i in range(nrows * ncols):
        patient_id = pid_sample_list[i]
        img = read_image(patient_id).pixel_array
        ax = fig.add_subplot(nrows, ncols, i + 1)
        plt.imshow(img, cmap='gray')
        ax.set_title(patient_id)
        if ax_off:
            ax.set_axis_off()
        if draw_bbox:
            bbox_rows = merged_df[merged_df['patientId'] == patient_id]
            for _, row in bbox_rows.iterrows():
                x, y = row['x'], row['y']
                width, height = row['width'], row['height']
                bbox = patches.Rectangle((x, y), width, height, linewidth=.5, edgecolor='r', facecolor='none')
                ax.add_patch(bbox)
    plt.tight_layout()
    plt.subplots_adjust(wspace=.01, hspace=.01)
    return fig
# DICOM image with class label "No Lung Opacity / Not Normal"
print('Label: No Lung Opacity / Not Normal')
print(label_meta_data['patientId'][0])
filename = label_meta_data['patientId'][0] + '.dcm'
filename = (os.path.join(root_path,'stage_2_train_images',filename))
dataset = pyd.dcmread(filename)
plt.imshow(dataset.pixel_array, cmap=plt.cm.bone)
plt.show()
Label: No Lung Opacity / Not Normal 0004cfab-14fd-4e49-80ba-63a80b6bddd6
# DICOM image with class label "Normal"
print('Label: Normal')
print(label_meta_data['patientId'][3])
filename = label_meta_data['patientId'][3] + '.dcm'
filename = (os.path.join(root_path,'stage_2_train_images',filename))
dataset = pyd.dcmread(filename)
plt.imshow(dataset.pixel_array, cmap=plt.cm.bone)
plt.show()
Label: Normal 003d8fa0-6bf1-40ed-b54c-ac657f8495c5
# DICOM image with class label "Lung Opacity"
from matplotlib.patches import Rectangle
print('Label: Lung Opacity')
print(train_labels_df['patientId'][4])
filename = train_labels_df['patientId'][4] + '.dcm'
filename = (os.path.join(root_path,'stage_2_train_images',filename))
dataset = pyd.dcmread(filename)
plt.imshow(dataset.pixel_array, cmap=plt.cm.bone)
bb = Rectangle((train_labels_df['x'][4], train_labels_df['y'][4]), train_labels_df['width'][4], train_labels_df['height'][4], fill=False, color='red')
plt.gca().add_patch(bb)
plt.show()
Label: Lung Opacity 00436515-870c-4b36-a041-de91049b9ab4
import pydicom
patient_ids = merged_df[merged_df['ViewPosition']=='PA']['patientId'].sample(20).tolist()
patient_ids_grid = image_grid(merged_df, patient_ids, nrows=2, ncols=3)
patient_ids = merged_df[merged_df['ViewPosition']=='AP']['patientId'].sample(20).tolist()
patient_ids_grid = image_grid(merged_df, patient_ids, nrows=2, ncols=3)
from functools import partial
def build_bbox_arrays_id(df):
    arrayConstructor = partial(np.zeros, shape=(1024, 1024), dtype=np.uint8)
    arrays = defaultdict(arrayConstructor)
    for idx, row in df.iterrows():
        patient_id = row['patientId']
        x, y = int(row['x']), int(row['y'])
        width, height = int(row['width']), int(row['height'])
        array = arrays[patient_id]
        array[y: y + height, x: x + width] += 1
    return arrays
bbox_arrays = build_bbox_arrays_id(merged_df[merged_df['Target']==1])
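`build_bbox_arrays_id` relies on the fact that in-place slice addition counts overlaps: a cell covered by k boxes ends up with value k, which is what makes the later density heat maps work. A minimal standalone check of that behaviour on a small grid (a sketch, not part of the pipeline):

```python
import numpy as np

def accumulate_boxes(boxes, size=8):
    """Increment every cell covered by each (x, y, width, height) box,
    mirroring the array[y:y+h, x:x+w] += 1 step above."""
    grid = np.zeros((size, size), dtype=np.uint8)
    for x, y, w, h in boxes:
        grid[y:y + h, x:x + w] += 1
    return grid

g = accumulate_boxes([(0, 0, 4, 4), (2, 2, 4, 4)])
print(g[3, 3], g[0, 0], g[5, 5])  # 2 1 1 -> overlap cell, first box only, second box only
```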
df_labels = merged_df.copy(deep=True)
bbox_counts = df_labels.groupby('patientId')['Target'].sum()
df_labels.index = df_labels.patientId
df_labels['bbox_counts'] = bbox_counts
df_labels = df_labels.reset_index(drop=True)
merged_df['patientAge'] = merged_df['patientAge'].astype(int)
patient_bbox_ids = {
'pa': set(merged_df['patientId'][merged_df['ViewPosition']=='PA'].dropna().unique()),
'ap': set(merged_df['patientId'][merged_df['ViewPosition']=='AP'].dropna().unique()),
'bbox_4': set(df_labels['patientId'][df_labels['bbox_counts']==4].dropna().unique()),
'bbox_3': set(df_labels['patientId'][df_labels['bbox_counts']==3].dropna().unique()),
'bbox_2': set(df_labels['patientId'][df_labels['bbox_counts']==2].dropna().unique()),
'bbox_1': set(df_labels['patientId'][df_labels['bbox_counts']==1].dropna().unique()),
'f': set(merged_df['patientId'][merged_df['patientSex']=='F'].dropna().unique()),
'm': set(merged_df['patientId'][merged_df['patientSex']=='M'].dropna().unique()),
'age_above_60': set(merged_df['patientId'][merged_df['patientAge'] > 60].dropna().unique()),
'age_40_to_60': set(merged_df['patientId'][(merged_df['patientAge'] <= 60) & (merged_df['patientAge'] >= 40)].dropna().unique()),
'age_below_40': set(merged_df['patientId'][merged_df['patientAge'] < 40].dropna().unique()),
}
def_array = partial(np.zeros, shape=(1024,1024), dtype=np.uint32)
bbox_groups = defaultdict(def_array)
bbox_groups['all'] = np.zeros(shape=(1024,1024), dtype=np.uint32)
for patient_id, bbox_array in bbox_arrays.items():
    # add to the 'all' group
    bbox_groups['all'] += bbox_array
    # add to every other group whose id set contains this patient
    for group, id_set in patient_bbox_ids.items():
        if patient_id in id_set:
            bbox_groups[group] += bbox_array
def graph_density(array, ax, title, n_countour_levels=3):
    contour_set = ax.contour(
        np.arange(0, 1024, 1),
        np.arange(1024, 0, -1),
        array,
        n_countour_levels,
        linewidths=.5,
        colors='black'
    )
    plt.clabel(contour_set, inline=True, fontsize=10, fmt='%.0f')
    im = ax.imshow(
        array,
        extent=[0, 1024, 0, 1024],
        origin='upper',
        cmap='viridis',
        alpha=.8
    )
    plt.colorbar(im, ax=ax)
    ax.set_title(title)
    return im
fig, axes = plt.subplots(1, 1, figsize=(14, 6), sharex=True)
bbox_all_targets = graph_density(bbox_groups['all'], axes, 'All Targets - Bounding Box Density')
Observation:
The heat map above gives a high-level view of where pneumonia bounding boxes fall across all cases.
fig, axes = plt.subplots(1, 2, figsize=(14, 6), sharex=True)
pa_bounding_box = graph_density(bbox_groups['pa'], axes[0], 'PA - Bounding Box Density')
ap_bounding_box = graph_density(bbox_groups['ap'], axes[1], 'AP - Bounding Box Density')
Observation:
The heat maps above clearly show that the AP position has more pneumonia cases than the PA position.
fig, axes = plt.subplots(2, 2, figsize=(12, 12))
bbox_1 = graph_density(bbox_groups['bbox_1'], axes[0, 0], '1 Box - Bounding Box Density', n_countour_levels=3)
bbox_2 = graph_density(bbox_groups['bbox_2'], axes[0, 1], '2 Boxes - Bounding Box Density', n_countour_levels=3)
bbox_3 = graph_density(bbox_groups['bbox_3'], axes[1, 0], '3 Boxes - Bounding Box Density', n_countour_levels=1)
bbox_4 = graph_density(bbox_groups['bbox_4'], axes[1, 1], '4 Boxes - Bounding Box Density', n_countour_levels=1)
Observation:
The heat maps above show that patients with 2 bounding boxes account for a higher pneumonia count than the other groups.
fig, axes = plt.subplots(1, 2, figsize=(12, 6))
bbox_f = graph_density(bbox_groups['f'], axes[0], 'PatientSex - F', n_countour_levels=3)
bbox_m = graph_density(bbox_groups['m'], axes[1], 'PatientSex - M', n_countour_levels=3)
Observation:
The heat maps above show that male patients have more pneumonia cases than female patients.
fig, axes = plt.subplots(1, 3, figsize=(18, 6))
bbox_age_above_60 = graph_density(bbox_groups['age_above_60'], axes[0], 'PatientAge > 60', n_countour_levels=3)
bbox_age_40_to_60 = graph_density(bbox_groups['age_40_to_60'], axes[1], '40 <= PatientAge <= 60', n_countour_levels=3)
bbox_age_below_40 = graph_density(bbox_groups['age_below_40'], axes[2], 'PatientAge < 40', n_countour_levels=3)
Observation:
The heat maps above show that pneumonia cases are concentrated in patients aged between 40 and 60.
# Splitting into training and validation dataset for input into data generator functions
from sklearn.model_selection import train_test_split
df = merged_df[['patientId','Target']]
df = df.drop_duplicates()
patientId = df['patientId']
Target = df['Target']
#X_train, X_val, Y_train, Y_val = train_test_split(patientId,Target, test_size=0.2, random_state=7)
#X_train, X_val, Y_train, Y_val = train_test_split(patientId,Target, test_size=0.3, random_state=42)
X_train, X_val, Y_train, Y_val = train_test_split(patientId,Target, test_size=0.4, random_state=42)
# Print the distribution of labels between the training and validation dataset
print("Fraction of Pneumonia labels in the training dataset is: {}".format(round(Y_train.value_counts()[1] / len(Y_train), 2)))
print("Fraction of Pneumonia labels in the validation dataset is: {}".format(round(Y_val.value_counts()[1] / len(Y_val), 2)))
print("No. of records in training dataset is: {}".format(len(X_train)))
print("No. of records in validation dataset is: {}".format(len(X_val)))
Fraction of Pneumonia labels in the training dataset is: 0.22 Fraction of Pneumonia labels in the validation dataset is: 0.23 No. of records in training dataset is: 16010 No. of records in validation dataset is: 10674
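The split above is not stratified, so the positive fractions (0.22 vs 0.23) match only approximately because the sample is large; passing `stratify=Target` to `train_test_split` would keep them essentially identical. The idea behind stratification can be sketched in plain Python (a hypothetical helper, not the scikit-learn implementation): split each class separately, then recombine.

```python
import random

def stratified_split(ids, labels, test_size=0.4, seed=42):
    """Split ids into train/val so each label keeps the same proportion --
    the effect of passing stratify=labels to train_test_split."""
    rng = random.Random(seed)
    train, val = [], []
    by_label = {}
    for i, lab in zip(ids, labels):
        by_label.setdefault(lab, []).append(i)
    for members in by_label.values():
        rng.shuffle(members)
        cut = int(round(len(members) * test_size))
        val.extend(members[:cut])
        train.extend(members[cut:])
    return train, val

ids = list(range(100))
labels = [1] * 20 + [0] * 80            # 20% positive, similar to this dataset
tr, va = stratified_split(ids, labels)
print(len(tr), len(va))                 # 60 40, each split exactly 20% positive
```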
Data Processing
Extract Data from DICOM file
merged_df_sample = merged_df.sample(n=30227)  # n equals the full row count, so this just shuffles the dataset
temp_data_directory = root_path + 'working_data'
print(temp_data_directory)
#os.mkdir(f'{temp_data_directory}')
#os.mkdir(f'{temp_data_directory}/positive')
#os.mkdir(f'{temp_data_directory}/negative')
/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data
for value in merged_df_sample['patientId']:
!cp '/content/drive/MyDrive/AI_ML_Projects/Capstone Project/stage_2_train_images/{value}.dcm' '/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/'
merged_df_sample['path']=f'{temp_data_directory}'+'/'+merged_df_sample['patientId'].astype(str)+'.dcm'
print(temp_data_directory)
#print(path)
/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data
import pydicom
age_list, sex_list, vp_list, studyid_list = [], [], [], []
merged_df_sample['age'] = 0
merged_df_sample['sex'] = ''
merged_df_sample['ViewPosition'] = ''
merged_df_sample['StudyID'] = ''
for value in merged_df_sample['patientId']:
    dcm_patientFile = '/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/%s.dcm' % value
    dcm_patientData = pydicom.dcmread(dcm_patientFile)
    age_list.append(dcm_patientData.PatientAge)
    sex_list.append(dcm_patientData.PatientSex)
    vp_list.append(dcm_patientData.ViewPosition)
    studyid_list.append(dcm_patientData.StudyID)
merged_df_sample['age'] = age_list
merged_df_sample['sex'] = sex_list
merged_df_sample['ViewPosition'] = vp_list
merged_df_sample['StudyID'] = studyid_list
merged_df_sample.head(10)
merged_df_sample.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 30227 entries, 8401 to 22308
Data columns (total 8 columns):
 #   Column     Non-Null Count  Dtype
---  ------     --------------  -----
 0   patientId  30227 non-null  object
 1   class      30227 non-null  object
 2   x          9555 non-null   float64
 3   y          9555 non-null   float64
 4   width      9555 non-null   float64
 5   height     9555 non-null   float64
 6   Target     30227 non-null  int64
 7   path       30227 non-null  object
dtypes: float64(4), int64(1), object(3)
memory usage: 2.1+ MB
Splitting Data into relative classes
negative_info=merged_df_sample[merged_df_sample['Target']==0]
print(len(negative_info))
negative_info.head()
20672
|  | patientId | class | x | y | width | height | Target | path |
|---|---|---|---|---|---|---|---|---|
| 8401 | 5f23653b-0afd-4a1d-94fe-865d31302d97 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | /content/drive/MyDrive/AI_ML_Projects/Capstone... |
| 24331 | daa5977a-a1f6-4e3d-9d50-7002d0a3930e | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | /content/drive/MyDrive/AI_ML_Projects/Capstone... |
| 11615 | 7863d975-f4da-486d-80ec-2bebc5f8b089 | Normal | NaN | NaN | NaN | NaN | 0 | /content/drive/MyDrive/AI_ML_Projects/Capstone... |
| 8599 | 60b743c3-cf7f-4770-9926-ab3e8b76ecba | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | /content/drive/MyDrive/AI_ML_Projects/Capstone... |
| 9969 | 6b80c461-49ac-4e42-af05-bbc7fcb1e0ad | Normal | NaN | NaN | NaN | NaN | 0 | /content/drive/MyDrive/AI_ML_Projects/Capstone... |
positive_info=merged_df_sample[merged_df_sample['Target']==1]
unique_positive=positive_info[['path','patientId']]
#print(path)
path=unique_positive['path'].unique()
#print(path)
patientId=unique_positive['patientId'].unique()
unique_positive=pd.DataFrame({'path':path,'patientId':patientId})
len(unique_positive)
6012
from tqdm import tqdm
from skimage.transform import resize
for _, row in tqdm(unique_positive.iterrows()):
    img = pydicom.dcmread(row['path']).pixel_array
    img = resize(img, (256, 256))
    plt.imsave('/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/positive/' + row['patientId'] + '.jpg', img, cmap='gray')
6012it [06:58, 14.38it/s]
negative_info = merged_df_sample[merged_df_sample['Target']==0]
negative = negative_info[['path','patientId']]
path = negative['path'].unique()
patientId = negative['patientId'].unique()
negative = pd.DataFrame({'path':path, 'patientId':patientId})
len(negative)
20672
for _, row in tqdm(negative.iterrows()):
    img = pydicom.dcmread(row['path']).pixel_array
    img = resize(img, (256, 256))
    plt.imsave('/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/negative/' + row['patientId'] + '.jpg', img, cmap='gray')
20672it [23:35, 14.60it/s]
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.applications.vgg19 import VGG19, preprocess_input
# Caution: rescale and samplewise normalization are applied in addition to VGG19's
# preprocess_input, so inputs are effectively normalized more than once.
datagen = ImageDataGenerator(samplewise_center=True, samplewise_std_normalization=True, horizontal_flip=True,
                             width_shift_range=0.05, rescale=1/255, fill_mode='nearest', height_shift_range=0.05,
                             preprocessing_function=preprocess_input, validation_split=0.1,
                             )
train=datagen.flow_from_directory('/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data',color_mode='rgb',batch_size=128,class_mode='binary',subset='training')
test=datagen.flow_from_directory('/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data',color_mode='rgb',batch_size=32,class_mode='binary',subset='validation')
Found 24016 images belonging to 2 classes. Found 2668 images belonging to 2 classes.
train.class_indices
{'negative': 0, 'positive': 1}
print(tf.__version__)
2.7.0
import pydicom
print(pydicom.__version__)
2.2.2
output_path = '/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/'
patient_df_path = output_path + 'patient_df.pb'
train_path = '/content/drive/MyDrive/AI_ML_Projects/Capstone Project/stage_2_train_images/'
input_path="/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/"
# output weights filepaths
#cust_unet_path = './cust_unet_seg.h5'
#mobilenet_unet_path = './umobilnet_seg.h5'
#basic_cnn_path = './basic_cnn_seg.h5'
basic_unet_i64_b64_e6_path = '/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/basic_unet_i64_b64_e6_seg.h5'
# Pickle output files
#basic_cnn_history = output_path + 'BasicCNN_history.pickle'
#basic_unet_history = output_path + 'BasicUnet_history.pickle'
#unet_mobnet_history = output_path + 'unet_mobnet_history.pickle'
basic_unet_i64_b64_e6_history = output_path + 'basic_unet_i64_b64_e6_history.pickle'
#basic_cnn_creport = output_path +"BasicCNN_creport.pickle"
#basic_unet_creport = output_path + "BasicUnet_creport.pickle"
#unet_mobnet_creport = output_path + 'unet_mobnet_creport.pickle'
basic_unet_i64_b64_e6_creport = output_path + 'basic_unet_i64_b64_e6_creport.pickle'
# Model filepaths
#basic_cnn_model = output_path + 'BasicCNN_model.h5'
#basic_unet_model = output_path + "BasicUnet_model.h5"
#unet_mobnet_model = output_path + "unet_mobnet_model.h5"
basic_unet_i64_b64_e6_model_path = output_path + 'basic_unet_i64_b64_e6_model.h5'
IMAGE_SHAPE = [1024,1024]
IMG_PATH = train_path
INPUT_SIZE_64 = [64,64,3]
BATCH_SIZE_8 = 8
BATCH_SIZE_32 = 32
BATCH_SIZE_64 = 64
EPOCH_SIZE = 6
def get_img(img_path, patientId):
    '''Function to get a pixel array image from a dcm file'''
    dcm_filename = img_path + patientId + '.dcm'
    dcm_img = pydicom.dcmread(dcm_filename)
    img = dcm_img.pixel_array
    return dcm_filename, img
def get_mask(bboxes, input_size):
    '''Function to build a mask given the bounding-box co-ordinates and the target input_size'''
    # add 1's at the location of pneumonia
    mask = np.zeros(IMAGE_SHAPE)  # Black background
    for bbox in bboxes:
        if not np.isnan(bbox).any():
            xmin, ymin, width, height = [int(i) for i in bbox]  # Get box co-ordinates
            mask[ymin:ymin+height, xmin:xmin+width] = 1  # Highlight the pneumonia area
    mask = resize(mask, input_size, mode='reflect')  # Resize output mask to input size
    return mask
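The mask construction above can be checked on a toy example. This is a minimal NumPy sketch mirroring `get_mask` (skipping the final resize step); the 8x8 canvas is a toy stand-in for the 1024x1024 `IMAGE_SHAPE`:

```python
import numpy as np

def toy_get_mask(bboxes, shape=(8, 8)):
    mask = np.zeros(shape)                 # black background
    for bbox in bboxes:
        if not np.isnan(bbox).any():
            xmin, ymin, width, height = [int(i) for i in bbox]
            mask[ymin:ymin + height, xmin:xmin + width] = 1  # highlight box area
    return mask

box_mask = toy_get_mask([[2, 1, 3, 2]])   # box: x=2, y=1, width=3, height=2
print(int(box_mask.sum()))                # 6 pixels inside the box
nan_mask = toy_get_mask([[np.nan] * 4])   # NaN bbox (no pneumonia) -> empty mask
print(int(nan_mask.sum()))                # 0
```

A NaN bbox leaves the mask all-zero, which is exactly how negative patients flow through the pipeline.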
def draw_box(image, box):
    '''Function to draw a rectangle on an image'''
    # Convert coordinates to integers and extract them for the rectangle
    x, y, width, height = map(int, box)
    x1 = x + width
    y1 = y + height
    color = (255, 0, 0)
    thickness = 8
    # Draw a rectangle with line borders of thickness 8 px
    image = cv2.rectangle(image, (x, y), (x1, y1), color, thickness)
    return image
def get_boxes(patient_df, patientId):
    '''Return bboxes for the given patientId'''
    return patient_df[patient_df['patientId'] == patientId]['bboxes'].values[0]
def draw_bounded_image(img_path, patient_df, patientId):
    '''Bound the image with boxes covering the affected pneumonia areas'''
    # Get pixel image array
    file, image = get_img(img_path, patientId)
    # Get box co-ordinates for this patient
    boxes = get_boxes(patient_df, patientId)
    # Patients without pneumonia have NaN boxes - skip drawing for them
    if not np.isnan(np.sum(boxes)).any():
        for box in boxes:
            image = draw_box(image=image, box=box)  # Overlay the box on the image
    return file, image
def printTwoImgs(img1, img2, title1, title2):
    fig, ax = plt.subplots(1, 2, figsize=(10,10))
    ax[0].imshow(img1, cmap="gray")
    ax[1].imshow(img2, cmap="gray")
    ax[0].set_title(title1)
    ax[1].set_title(title2)
    ax[0].axis('off')
    ax[1].axis('off')
# Define the preprocess input function to be called from CustomDataGen
def preprocess_input(X):
    '''Function to preprocess an image. This can vary if we use imported model architectures'''
    return X/255
class CustomDataGen(tf.keras.utils.Sequence):
    '''Define Custom Data Generator'''
    # Keras ImageDataGenerator cannot be used here because it cannot easily handle
    # DICOM images, nor create masks in batches to save memory.

    # Called on initialization
    def __init__(self, df,
                 batch_size,
                 input_size=INPUT_SIZE_64,
                 shuffle=True,
                 augment='no',
                 ):
        self.df = df.copy()
        self.batch_size = batch_size
        self.input_size = input_size
        self.shuffle = shuffle
        self.augment = augment
        self.n = len(self.df)

    # Private method called by __getdata. Creates the mask for an image
    def __getmask(self, bboxes, input_size):
        # add 1's at the location of pneumonia
        mask = np.zeros(IMAGE_SHAPE)  # Black background
        for bbox in bboxes:
            if not np.isnan(bbox).any():
                xmin, ymin, width, height = [int(i) for i in bbox]  # Get box co-ordinates
                mask[ymin:ymin+height, xmin:xmin+width] = 1  # Highlight the pneumonia area
        mask = resize(mask, input_size, mode='reflect')  # Resize output mask to input size
        return mask

    # Private method called by __getdata. Gets pixel image array and preprocesses it (UNET backbone)
    def __getimage(self, path, input_size):
        image_arr = pydicom.dcmread(path).pixel_array
        image_arr = np.stack((image_arr,)*3, axis=-1)  # Expand grayscale image to 3 channels
        image_arr = resize(image_arr, input_size, mode='reflect')
        return image_arr

    # Private method called by __getitem__. Gets X and y for a batch
    def __getdata(self, batches):
        # Generates batch_size samples of features (image pixel arrays) and targets (masks)
        batch_paths = [IMG_PATH + patientId + '.dcm' for patientId in batches['patientId']]
        batch_bboxes = batches['bboxes'].values
        X_batch = np.asarray([self.__getimage(path, self.input_size) for path in batch_paths])
        y_batch = np.asarray([self.__getmask(bbox, self.input_size) for bbox in batch_bboxes])
        return X_batch, y_batch

    # Mandatory Sequence class method, called to yield the batch of data at index
    def __getitem__(self, index):
        batch_start = index * self.batch_size
        batch_end = (index + 1) * self.batch_size
        batches = self.df[batch_start:batch_end]
        X_batch, y_batch = self.__getdata(batches)
        X_batch = preprocess_input(X_batch)
        return X_batch, y_batch

    # Mandatory Sequence class method. Number of steps per epoch (a trailing partial batch is dropped)
    def __len__(self):
        return self.n // self.batch_size

    # Optional Sequence class method, called at the end of every epoch during model run.
    def on_epoch_end(self):
        # Shuffle data at end of every epoch
        if self.shuffle:
            self.df = self.df.sample(frac=1).reset_index(drop=True)
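A quick sanity check on the batching arithmetic in `CustomDataGen`: `__len__` uses floor division, so a trailing partial batch is silently dropped each epoch. The numbers below assume the 10,673-sample training split and batch size 64 used later, which is where the 166 steps/epoch in the training log come from:

```python
# Floor division drops the trailing partial batch, as in CustomDataGen.__len__
n_samples, batch_size = 10673, 64
steps_per_epoch = n_samples // batch_size
leftover = n_samples - steps_per_epoch * batch_size
print(steps_per_epoch, leftover)  # 166 steps; 49 samples skipped per epoch
```

The per-epoch shuffle in `on_epoch_end` means different samples fall into the dropped remainder each epoch, so no sample is permanently excluded.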
def conv2d_block(input_tensor, n_filters=16):
    """Function to add 2 convolutional layers with the parameters passed to it"""
    # first convolution
    x = tf.keras.layers.Conv2D(n_filters, (3,3), kernel_initializer='he_normal', padding='same')(input_tensor)
    x = tf.keras.layers.BatchNormalization()(x)
    x = tf.keras.layers.Activation('relu')(x)
    # second convolution
    x = tf.keras.layers.Conv2D(n_filters, (3,3), kernel_initializer='he_normal', padding='same')(x)
    x = tf.keras.layers.BatchNormalization()(x)
    x = tf.keras.layers.Activation('relu')(x)
    return x
def create_cnnmodel(input_size=INPUT_SIZE_64):
    # Define input layer
    input_tensor = tf.keras.layers.Input(input_size, name='input_layer')
    # Stacked double-convolution blocks
    n_filters = 16
    c1 = conv2d_block(input_tensor, n_filters*1)
    c2 = conv2d_block(c1, n_filters*2)
    c4 = conv2d_block(c2, n_filters*4)
    # Build the output layer
    outputs = tf.keras.layers.Conv2D(1, kernel_size=1, activation='sigmoid')(c4)
    # Build the model using the different layers
    model = tf.keras.Model(inputs=[input_tensor], outputs=[outputs])
    return model
def down_sample(tensor, dropout):
    d = tf.keras.layers.MaxPooling2D((2, 2))(tensor)
    return tf.keras.layers.Dropout(dropout)(d)

def up_sample(tensor, n_filters):
    return tf.keras.layers.Conv2DTranspose(n_filters, (3, 3),
                                           strides=(2, 2),
                                           padding='same')(tensor)

def concat(tensor1, tensor2, dropout):
    c = tf.keras.layers.concatenate([tensor1, tensor2])
    return tf.keras.layers.Dropout(dropout)(c)
def create_custom_unet(input_size=INPUT_SIZE_64, n_filters=16, dropout=0.4):
    '''Function to build a custom UNET model from scratch using CNN blocks'''
    # Define input layer
    input_tensor = tf.keras.layers.Input(input_size, name='input_layer')
    # ENCODER - DOWNSAMPLE the image
    # First block of encoder
    c1 = conv2d_block(input_tensor, n_filters*1)  # 64 x 64 x 16
    e1 = down_sample(c1, dropout)
    # Second block of encoder
    c2 = conv2d_block(e1, n_filters*2)  # 32 x 32 x 32
    e2 = down_sample(c2, dropout)
    # Central block of encoder
    c3 = conv2d_block(e2, n_filters*4)  # 16 x 16 x 64
    # We now have the output of the encoder
    # DECODER - UPSAMPLE the features to generate the mask
    # Central block of decoder
    d3 = conv2d_block(c3, n_filters*4)  # 16 x 16 x 64
    # First block of decoder
    d2 = up_sample(d3, n_filters*4)
    d2 = concat(d2, c2, dropout)  # Skip connection to the second encoder block
    c4 = conv2d_block(d2, n_filters*2)  # 32 x 32 x 32
    # Second block of decoder
    d1 = up_sample(c4, n_filters*2)
    d1 = concat(d1, c1, dropout)  # Skip connection to the first encoder block
    c5 = conv2d_block(d1, n_filters*1)  # 64 x 64 x 16
    # Build the output layer
    outputs = tf.keras.layers.Conv2D(1, kernel_size=1, activation='sigmoid')(c5)
    # Build the model using the different layers
    model = tf.keras.Model(inputs=[input_tensor], outputs=[outputs])
    return model
def create_backbone_unet(backbone, input_size=INPUT_SIZE_64, dropout=0.4):
    '''Function to create a UNET model from a MobileNet backbone'''
    # Freeze encoder layers
    for layer in backbone.layers:
        layer.trainable = False
    model = backbone
    # Build decoder from intermediate encoder activations
    block1 = model.get_layer("conv_pw_1_relu").output
    block2 = model.get_layer("conv_pw_3_relu").output
    block3 = model.get_layer("conv_pw_5_relu").output
    block6 = model.get_layer("conv_pw_11_relu").output
    block7 = model.get_layer("conv_pw_13_relu").output
    n_filters = 32
    x = Concatenate()([UpSampling2D()(block7), block6])
    x = conv2d_block(x, n_filters * 8)
    x = Concatenate()([UpSampling2D()(x), block3])
    x = conv2d_block(x, n_filters * 4)
    x = Concatenate()([UpSampling2D()(x), block2])
    x = conv2d_block(x, n_filters * 2)
    x = Concatenate()([UpSampling2D()(x), block1])
    x = conv2d_block(x, n_filters)
    x = UpSampling2D()(x)
    x = conv2d_block(x, n_filters)
    # Output layer
    x = Conv2D(1, kernel_size=1, kernel_initializer='he_normal', padding='same')(x)
    x = tf.keras.layers.Activation('sigmoid')(x)
    return Model(inputs=model.input, outputs=x)
def create_unet_model(backbone='mobilenet', input_size=INPUT_SIZE_64, dropout=0.4):
    '''Creates a UNET model with the given backbone, or a custom UNET from scratch when backbone is 'None' '''
    if backbone == 'None':
        return create_custom_unet(input_size=input_size, dropout=dropout)
    # Only build the MobileNet encoder when a backbone is actually requested
    model = MobileNet(input_shape=input_size, include_top=False, alpha=1.0, weights='imagenet', dropout=dropout)
    return create_backbone_unet(model, input_size=input_size, dropout=dropout)
# Custom metric and loss functions for validation
def dice_coefficient(y_true, y_pred):
    numerator = 2 * tf.reduce_sum(y_true * y_pred)
    denominator = tf.reduce_sum(y_true + y_pred)
    return numerator / (denominator + tf.keras.backend.epsilon())

def loss(y_true, y_pred):
    return binary_crossentropy(y_true, y_pred) - tf.keras.backend.log(dice_coefficient(y_true, y_pred) + tf.keras.backend.epsilon())
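The dice coefficient can be sanity-checked on toy masks. This is a NumPy re-implementation of the same formula (epsilon kept only to avoid division by zero), not the TensorFlow function used in training:

```python
import numpy as np

def dice_np(y_true, y_pred, eps=1e-7):
    # Dice = 2*|A intersect B| / (|A| + |B|), mirroring dice_coefficient above
    numerator = 2 * np.sum(y_true * y_pred)
    denominator = np.sum(y_true + y_pred)
    return numerator / (denominator + eps)

y_true = np.array([1.0, 1.0, 0.0, 0.0])
print(round(dice_np(y_true, y_true), 4))  # identical masks  -> ~1.0
print(dice_np(y_true, 1 - y_true))        # disjoint masks   -> 0.0
```

Identical masks score (nearly) 1 and fully disjoint masks score 0, which is why maximizing the dice coefficient (via the `-log(dice)` term in the loss) pushes predicted masks to overlap the ground-truth boxes.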
def PlotMetrics(history):
    plt.figure(figsize=(8, 5))
    plt.grid(True)
    plt.plot(history.history['dice_coefficient'], label='Train Dice-Coef', color="green")
    plt.plot(history.history['val_dice_coefficient'], label='Val Dice-Coef', color="yellow")
    plt.plot(history.history['loss'], label='Train Loss', color="red")
    plt.plot(history.history['val_loss'], label='Val Loss', color="orange")
    plt.title("Validation and Training - Loss and Dice Coefficient vs Epoch")
    plt.xlabel("Epoch")
    plt.legend()
def predict_imagemask(model, patientId, patient_df, train_path, threshold=0.5, input_size=INPUT_SIZE_64):
    '''Function to predict a mask for an image'''
    # Get image pixel array
    _, image = get_img(train_path, patientId)
    image = resize(image, input_size, mode='reflect')
    # Get bboxes and expected mask
    bboxes = patient_df[patient_df['patientId'] == patientId]['bboxes'].values[0]
    mask = get_mask(bboxes, input_size)
    # Prepare the image for feeding into the model
    image_rescaled = preprocess_input(np.expand_dims(image, 0))
    # Predict mask for image
    pred_mask = model.predict(image_rescaled)
    # Remove the batch dimension
    pred_mask = pred_mask[0]
    # The mask contains probabilities from the sigmoid; threshold them to 0/1 pixel values
    pred_mask = (pred_mask > threshold) * 1.0
    return (image, mask, pred_mask)
def get_predictions(model, val_df, train_path, input_size):
    y_pred = []
    for patientId in val_df['patientId']:
        image, mask, pred_mask = predict_imagemask(model, patientId, val_df, train_path, input_size=input_size)
        y_pred.append((1 in pred_mask) * 1)  # Any positive pixel flags the image as positive
    y_true = val_df['Target']
    return y_true, y_pred
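The `(1 in pred_mask) * 1` expression above is how a pixel-level segmentation mask is collapsed into an image-level label: NumPy's membership test is equivalent to `(pred_mask == 1).any()`, so a single positive pixel classifies the whole image as pneumonia. A toy illustration:

```python
import numpy as np

pred_mask = np.zeros((4, 4))
print((1 in pred_mask) * 1)   # 0: empty mask -> negative image
pred_mask[1, 2] = 1.0         # a single thresholded-positive pixel
print((1 in pred_mask) * 1)   # 1: whole image labeled positive
```

This makes the image-level classifier extremely sensitive: any noise pixel surviving the 0.5 threshold yields a positive prediction, which helps explain the confusion matrix later (high recall on class 1, near-zero recall on class 0).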
def print_confusion_matrix(y_true, y_pred):
    '''Function to print confusion_matrix'''
    # Get confusion matrix array
    array = confusion_matrix(y_true, y_pred)
    df_cm = pd.DataFrame(array, range(2), range(2))
    print("Total samples = ", len(y_true))
    # Plot heatmap and get sns heatmap values
    sns.set(font_scale=1.4)  # for label size
    result = sns.heatmap(df_cm, annot=True, annot_kws={"size": 16}, fmt='g', cbar=False)
    # Add labels to heatmap
    labels = ['TN=', 'FP=', 'FN=', 'TP=']
    for i, t in enumerate(result.texts):
        t.set_text(labels[i] + t.get_text())
    plt.xlabel("Predicted Values")
    plt.ylabel('True Values')
    plt.show()
    return
def print_image_mask(image, mask, pred_mask):
    # Plot three images side by side
    fig, ax = plt.subplots(1, 3, figsize=(10,10))
    ax[0].imshow(image, cmap=plt.cm.gist_gray)  # Show the image
    ax[0].axis('off')  # Remove axis
    ax[0].set_title("IMAGE")
    # Visualize the expected mask superimposed on the image
    ax[1].imshow(mask * image, cmap=plt.cm.gist_gray)
    ax[1].set_title("Expected Mask")
    ax[1].axis('off')
    # Visualize the predicted mask superimposed on the image
    ax[2].imshow(pred_mask * image, cmap=plt.cm.gist_gray)
    ax[2].set_title("Predicted Mask")
    ax[2].axis('off')
    plt.show()
labels_file = '/content/drive/MyDrive/AI_ML_Projects/Capstone Project/stage_2_train_labels.csv'
class_file = '/content/drive/MyDrive/AI_ML_Projects/Capstone Project/stage_2_detailed_class_info.csv'
labels_df = pd.read_csv(labels_file)
classes_df = pd.read_csv(class_file)
def get_single_df(labels_df, classes_df, df_path, images_path):
    '''Function to merge the labels and classes dataframes and add relevant DICOM image metadata columns'''
    # If the session has a prestored dataframe pickle, use it.
    if os.path.isfile(df_path):
        print("Reading prestored dataframe object")
        patient_df = pd.read_pickle(df_path)
        return patient_df
    bboxes = [labels_df[labels_df['patientId'] == patientId].loc[:, 'x':'height'].values
              for patientId in labels_df['patientId']]  # Get list of bboxes for each patientId
    patient_df = labels_df.copy()
    patient_df.insert(1, 'bboxes', bboxes)  # Add new bboxes column
    patient_df.insert(2, 'class', classes_df['class'])  # Add class column
    # Remove x, y, width, height columns and drop the duplicate rows of patientIds
    patient_df = patient_df.drop(columns=['x','y','width','height']).drop_duplicates(subset=['patientId'], ignore_index=True)
    # Initialize columns for DICOM metadata
    patient_df['gender'] = np.nan
    patient_df['age'] = np.nan
    patient_df['viewpos'] = np.nan
    patient_df['dimx'] = np.nan
    patient_df['dimy'] = np.nan
    for patientId in patient_df['patientId']:
        # Get dcm filename
        dcm_file = images_path + patientId + '.dcm'
        ds = pydicom.dcmread(dcm_file)
        # Get the row indices of patientId in the dataframe
        indices = patient_df.index[patient_df['patientId'] == patientId].tolist()
        # Add the sex, age, viewpos, image dimension data to the dataframe (.loc accepts a list of indices; .at does not)
        patient_df.loc[indices, 'gender'] = ds.PatientSex
        patient_df.loc[indices, 'age'] = ds.PatientAge
        patient_df.loc[indices, 'viewpos'] = ds[0x0018, 0x5101].value
        patient_df.loc[indices, 'dimx'] = ds[0x0028, 0x0010].value
        patient_df.loc[indices, 'dimy'] = ds[0x0028, 0x0011].value
    # Store the updated patient_df dataframe as an object for faster retrieval in subsequent sessions
    patient_df.to_pickle(df_path)  # (not using to_csv because it stores bboxes lists as strings)
    return patient_df
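The bbox-grouping step inside `get_single_df` can be illustrated at toy scale: the labels table has one row per bounding box, and grouping collapses it to one row per patient with all of that patient's boxes gathered into a single `bboxes` cell. A hypothetical 3-row example:

```python
import numpy as np
import pandas as pd

# Toy labels table: patient 'a' has two boxes, patient 'b' has none (NaN row)
labels = pd.DataFrame({
    'patientId': ['a', 'a', 'b'],
    'x': [10.0, 50.0, np.nan], 'y': [20.0, 60.0, np.nan],
    'width': [30.0, 30.0, np.nan], 'height': [40.0, 40.0, np.nan],
})
# Gather all of each patient's boxes into one (n_boxes, 4) array
bboxes = [labels[labels['patientId'] == pid].loc[:, 'x':'height'].values
          for pid in labels['patientId']]
grouped = labels.copy()
grouped.insert(1, 'bboxes', bboxes)
grouped = grouped.drop(columns=['x', 'y', 'width', 'height']) \
                 .drop_duplicates(subset=['patientId'], ignore_index=True)
print(len(grouped), grouped.loc[0, 'bboxes'].shape)  # 2 patients; 'a' keeps 2 boxes
```

Note the deduplication keys only on `patientId`, so the first row's `bboxes` cell (which already contains every box for that patient) is the one retained.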
# Get a single dataframe by merging the labels_df and classes_df dataframes and adding relevant DICOM metadata columns
df_path = patient_df_path # dataframe pickle object file to store or retrieve to/from disk
images_path = train_path # path where DICOM images are stored
patient_df = get_single_df(labels_df,classes_df,patient_df_path, images_path)
patient_df.head()
Reading prestored dataframe object
|  | patientId | bboxes | class | Target | gender | age | viewpos | dimx | dimy |
|---|---|---|---|---|---|---|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | [[nan, nan, nan, nan]] | No Lung Opacity / Not Normal | 0 | F | 51 | PA | 1024.0 | 1024.0 |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | [[nan, nan, nan, nan]] | No Lung Opacity / Not Normal | 0 | F | 48 | PA | 1024.0 | 1024.0 |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | [[nan, nan, nan, nan]] | No Lung Opacity / Not Normal | 0 | M | 19 | AP | 1024.0 | 1024.0 |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | [[nan, nan, nan, nan]] | Normal | 0 | M | 28 | PA | 1024.0 | 1024.0 |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | [[264.0, 152.0, 213.0, 379.0], [562.0, 152.0, ... | Lung Opacity | 1 | F | 32 | AP | 1024.0 | 1024.0 |
patient_df.shape
(26684, 9)
print('X dimension of images:' , patient_df['dimx'].value_counts())
print('Y dimension of images:', patient_df['dimy'].value_counts())
X dimension of images: 1024.0    26684
Name: dimx, dtype: int64
Y dimension of images: 1024.0    26684
Name: dimy, dtype: int64
# To split the dataframe, form an array of all the indices
indices = range(len(patient_df))
target = patient_df['Target']  # for stratified splitting
# First split to take half of the dataset as the total samples to feed into the model
sample_indices, _, sample_target, _ = train_test_split(indices, target, test_size=0.5, random_state=42, stratify=target)
# Split those indices further into train and val indices
train_indices, val_indices, _, _ = train_test_split(sample_indices, sample_target, test_size=0.2, random_state=42, stratify=sample_target)
# Get the train and validation dataframes from the split indices
patient_data = patient_df[['patientId','bboxes','Target']]
train_df = patient_data.loc[train_indices]
val_df = patient_data.loc[val_indices]
print("Total Samples: ",len(patient_df) )
total_sample_size = len(train_df) + len(val_df)
print("Total samples selected for model training:", total_sample_size)
print("Train samples:", len(train_df))
print("Validation samples:", len(val_df))
print("Ratio of train to validation samples: %0.1f : %0.1f" % (len(train_df)/total_sample_size, len(val_df)/total_sample_size))
print("Positive samples in original dataset: %0.0f%%" %((len(patient_df[patient_df['Target'] == 1])/ len(patient_df))*100))
print("Positive samples in train dataset: %0.0f%%" %((len(train_df[train_df['Target'] == 1]) /len(train_df)) * 100))
print("Positive samples in test dataset:%0.0f%%" %((len(val_df[val_df['Target'] == 1])/len(val_df))*100))
Total Samples:  26684
Total samples selected for model training: 13342
Train samples: 10673
Validation samples: 2669
Ratio of train to validation samples: 0.8 : 0.2
Positive samples in original dataset: 23%
Positive samples in train dataset: 23%
Positive samples in test dataset: 23%
BACKBONE = 'None'
basic_unet_i64_b64_e6_model = create_unet_model(BACKBONE, input_size=INPUT_SIZE_64)
if os.path.isfile(basic_unet_i64_b64_e6_path):
    print("LOADING WEIGHTS FROM PREVIOUS MODEL\n")
    basic_unet_i64_b64_e6_model.load_weights(basic_unet_i64_b64_e6_path, by_name=True)
basic_unet_i64_b64_e6_model.summary()
WARNING:tensorflow:`input_shape` is undefined or non-square, or `rows` is not in [128, 160, 192, 224]. Weights for input shape (224, 224) will be loaded as the default.
LOADING WEIGHTS FROM PREVIOUS MODEL
Model: "model_1"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_layer (InputLayer) [(None, 64, 64, 3)] 0 []
conv2d_11 (Conv2D) (None, 64, 64, 16) 448 ['input_layer[0][0]']
batch_normalization_10 (BatchN (None, 64, 64, 16) 64 ['conv2d_11[0][0]']
ormalization)
activation_11 (Activation) (None, 64, 64, 16) 0 ['batch_normalization_10[0][0]']
conv2d_12 (Conv2D) (None, 64, 64, 16) 2320 ['activation_11[0][0]']
batch_normalization_11 (BatchN (None, 64, 64, 16) 64 ['conv2d_12[0][0]']
ormalization)
activation_12 (Activation) (None, 64, 64, 16) 0 ['batch_normalization_11[0][0]']
max_pooling2d (MaxPooling2D) (None, 32, 32, 16) 0 ['activation_12[0][0]']
dropout (Dropout) (None, 32, 32, 16) 0 ['max_pooling2d[0][0]']
conv2d_13 (Conv2D) (None, 32, 32, 32) 4640 ['dropout[0][0]']
batch_normalization_12 (BatchN (None, 32, 32, 32) 128 ['conv2d_13[0][0]']
ormalization)
activation_13 (Activation) (None, 32, 32, 32) 0 ['batch_normalization_12[0][0]']
conv2d_14 (Conv2D) (None, 32, 32, 32) 9248 ['activation_13[0][0]']
batch_normalization_13 (BatchN (None, 32, 32, 32) 128 ['conv2d_14[0][0]']
ormalization)
activation_14 (Activation) (None, 32, 32, 32) 0 ['batch_normalization_13[0][0]']
max_pooling2d_1 (MaxPooling2D) (None, 16, 16, 32) 0 ['activation_14[0][0]']
dropout_1 (Dropout) (None, 16, 16, 32) 0 ['max_pooling2d_1[0][0]']
conv2d_15 (Conv2D) (None, 16, 16, 64) 18496 ['dropout_1[0][0]']
batch_normalization_14 (BatchN (None, 16, 16, 64) 256 ['conv2d_15[0][0]']
ormalization)
activation_15 (Activation) (None, 16, 16, 64) 0 ['batch_normalization_14[0][0]']
conv2d_16 (Conv2D) (None, 16, 16, 64) 36928 ['activation_15[0][0]']
batch_normalization_15 (BatchN (None, 16, 16, 64) 256 ['conv2d_16[0][0]']
ormalization)
activation_16 (Activation) (None, 16, 16, 64) 0 ['batch_normalization_15[0][0]']
conv2d_17 (Conv2D) (None, 16, 16, 64) 36928 ['activation_16[0][0]']
batch_normalization_16 (BatchN (None, 16, 16, 64) 256 ['conv2d_17[0][0]']
ormalization)
activation_17 (Activation) (None, 16, 16, 64) 0 ['batch_normalization_16[0][0]']
conv2d_18 (Conv2D) (None, 16, 16, 64) 36928 ['activation_17[0][0]']
batch_normalization_17 (BatchN (None, 16, 16, 64) 256 ['conv2d_18[0][0]']
ormalization)
activation_18 (Activation) (None, 16, 16, 64) 0 ['batch_normalization_17[0][0]']
conv2d_transpose (Conv2DTransp (None, 32, 32, 64) 36928 ['activation_18[0][0]']
ose)
concatenate_4 (Concatenate) (None, 32, 32, 96) 0 ['conv2d_transpose[0][0]',
'activation_14[0][0]']
dropout_2 (Dropout) (None, 32, 32, 96) 0 ['concatenate_4[0][0]']
conv2d_19 (Conv2D) (None, 32, 32, 32) 27680 ['dropout_2[0][0]']
batch_normalization_18 (BatchN (None, 32, 32, 32) 128 ['conv2d_19[0][0]']
ormalization)
activation_19 (Activation) (None, 32, 32, 32) 0 ['batch_normalization_18[0][0]']
conv2d_20 (Conv2D) (None, 32, 32, 32) 9248 ['activation_19[0][0]']
batch_normalization_19 (BatchN (None, 32, 32, 32) 128 ['conv2d_20[0][0]']
ormalization)
activation_20 (Activation) (None, 32, 32, 32) 0 ['batch_normalization_19[0][0]']
conv2d_transpose_1 (Conv2DTran (None, 64, 64, 32) 9248 ['activation_20[0][0]']
spose)
concatenate_5 (Concatenate) (None, 64, 64, 48) 0 ['conv2d_transpose_1[0][0]',
'activation_12[0][0]']
dropout_3 (Dropout) (None, 64, 64, 48) 0 ['concatenate_5[0][0]']
conv2d_21 (Conv2D) (None, 64, 64, 16) 6928 ['dropout_3[0][0]']
batch_normalization_20 (BatchN (None, 64, 64, 16) 64 ['conv2d_21[0][0]']
ormalization)
activation_21 (Activation) (None, 64, 64, 16) 0 ['batch_normalization_20[0][0]']
conv2d_22 (Conv2D) (None, 64, 64, 16) 2320 ['activation_21[0][0]']
batch_normalization_21 (BatchN (None, 64, 64, 16) 64 ['conv2d_22[0][0]']
ormalization)
activation_22 (Activation) (None, 64, 64, 16) 0 ['batch_normalization_21[0][0]']
conv2d_23 (Conv2D) (None, 64, 64, 1) 17 ['activation_22[0][0]']
==================================================================================================
Total params: 240,097
Trainable params: 239,201
Non-trainable params: 896
__________________________________________________________________________________________________
print("Number of layers in the model = ", len(basic_unet_i64_b64_e6_model.layers))
Number of layers in the model = 48
optimizer = Adam(learning_rate=1e-4, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.0, amsgrad=False)
basic_unet_i64_b64_e6_model.compile(loss= loss, optimizer=optimizer, metrics=[dice_coefficient])
traingen = CustomDataGen(train_df,batch_size=BATCH_SIZE_64, input_size=INPUT_SIZE_64)
valgen = CustomDataGen(val_df,batch_size=BATCH_SIZE_64, input_size=INPUT_SIZE_64)
#Saving the best model using model checkpoint callback.
outfile = basic_unet_i64_b64_e6_path # update with new weights
model_checkpoint=tf.keras.callbacks.ModelCheckpoint(outfile,
save_best_only=True,
monitor='val_dice_coefficient',
mode='max',
verbose=1)
#es = tf.keras.callbacks.EarlyStopping(monitor='val_dice_coefficient', mode='max', verbose=1, patience=3)
history = basic_unet_i64_b64_e6_model.fit(traingen,
epochs=EPOCH_SIZE,
validation_data=valgen,
callbacks=[model_checkpoint],
use_multiprocessing=True,
workers=4,
shuffle=True,
verbose=1)
Epoch 1/6
166/166 [==============================] - 2329s 14s/step - loss: 3.4209 - dice_coefficient: 0.0727 - val_loss: 10.9402 - val_dice_coefficient: 0.0530
Epoch 00001: val_dice_coefficient improved from -inf to 0.05298, saving model to /content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/basic_unet_i64_b64_e6_seg.h5
Epoch 2/6
166/166 [==============================] - 2317s 14s/step - loss: 2.9366 - dice_coefficient: 0.0990 - val_loss: 6.0116 - val_dice_coefficient: 0.0633
Epoch 00002: val_dice_coefficient improved from 0.05298 to 0.06334, saving model to /content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/basic_unet_i64_b64_e6_seg.h5
Epoch 3/6
166/166 [==============================] - 2347s 14s/step - loss: 2.7024 - dice_coefficient: 0.1112 - val_loss: 3.9881 - val_dice_coefficient: 0.0756
Epoch 00003: val_dice_coefficient improved from 0.06334 to 0.07558, saving model to /content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/basic_unet_i64_b64_e6_seg.h5
Epoch 4/6
166/166 [==============================] - 2347s 14s/step - loss: 2.5698 - dice_coefficient: 0.1200 - val_loss: 3.0552 - val_dice_coefficient: 0.0685
Epoch 00004: val_dice_coefficient did not improve from 0.07558
Epoch 5/6
166/166 [==============================] - 2335s 14s/step - loss: 2.4555 - dice_coefficient: 0.1287 - val_loss: 4.7981 - val_dice_coefficient: 0.0799
Epoch 00005: val_dice_coefficient improved from 0.07558 to 0.07991, saving model to /content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/basic_unet_i64_b64_e6_seg.h5
Epoch 6/6
166/166 [==============================] - 2346s 14s/step - loss: 2.3396 - dice_coefficient: 0.1387 - val_loss: 3.6342 - val_dice_coefficient: 0.0945
Epoch 00006: val_dice_coefficient improved from 0.07991 to 0.09451, saving model to /content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/basic_unet_i64_b64_e6_seg.h5
history.history
{'dice_coefficient': [0.07273805886507034,
0.09901139885187149,
0.1112402006983757,
0.11998417973518372,
0.12869375944137573,
0.13874560594558716],
'loss': [3.4209039211273193,
2.9366328716278076,
2.702406644821167,
2.5698189735412598,
2.45550799369812,
2.339646100997925],
'val_dice_coefficient': [0.05297818407416344,
0.06333903223276138,
0.07557635009288788,
0.06851941347122192,
0.07990893721580505,
0.09450744092464447],
'val_loss': [10.9402494430542,
6.011646747589111,
3.9880969524383545,
3.055246353149414,
4.7981038093566895,
3.6342408657073975]}
print("Basic UNET Model\n")
PlotMetrics(history)
Basic UNET Model
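`PlotMetrics` is a project helper whose definition is not shown in this excerpt. A minimal sketch of what it could look like, assuming the history dict carries the `loss`/`val_loss` and `dice_coefficient`/`val_dice_coefficient` keys seen in the output above (the notebook's actual helper may differ):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the sketch also runs outside a notebook
import matplotlib.pyplot as plt

def PlotMetrics(history):
    """Plot training vs. validation loss and Dice coefficient side by side."""
    # Accept either a Keras History object or its .history dict
    h = history.history if hasattr(history, "history") else history
    fig, (ax_loss, ax_dice) = plt.subplots(1, 2, figsize=(14, 5))
    ax_loss.plot(h["loss"], label="loss")
    ax_loss.plot(h["val_loss"], label="val_loss")
    ax_loss.set_xlabel("epoch")
    ax_loss.set_title("Loss")
    ax_loss.legend()
    ax_dice.plot(h["dice_coefficient"], label="dice_coefficient")
    ax_dice.plot(h["val_dice_coefficient"], label="val_dice_coefficient")
    ax_dice.set_xlabel("epoch")
    ax_dice.set_title("Dice coefficient")
    ax_dice.legend()
    plt.show()
    return fig
```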
# Pickle dump the history dictionary
filename = basic_unet_i64_b64_e6_history
with open(filename, "wb") as filehandle:
    pickle.dump(history.history, filehandle)
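The dumped history can be restored in a later session with `pickle.load`, so the curves can be re-plotted without retraining. A self-contained round-trip sketch (the path and dict values here are illustrative, not the notebook's):

```python
import os
import pickle
import tempfile

# Illustrative history dict; the real one is history.history from model.fit
history_dict = {"loss": [3.42, 2.94], "val_loss": [10.94, 6.01]}

path = os.path.join(tempfile.gettempdir(), "demo_history.pkl")
with open(path, "wb") as fh:
    pickle.dump(history_dict, fh)       # serialize to disk

with open(path, "rb") as fh:
    restored = pickle.load(fh)          # read it back unchanged

assert restored == history_dict
```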
basic_unet_i64_b64_e6_model.load_weights(basic_unet_i64_b64_e6_path)
# Get predictions for Validation dataset and evaluate model performance
y_true,y_pred = get_predictions(basic_unet_i64_b64_e6_model, val_df, train_path, input_size=INPUT_SIZE_64)
print("BASIC UNET MODEL - CONFUSION MATRIX\n")
print_confusion_matrix(y_true, y_pred)
BASIC UNET MODEL - CONFUSION MATRIX

Total samples = 2669
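`print_confusion_matrix` is likewise a project helper not defined in this excerpt. A hedged sketch built on `sklearn.metrics.confusion_matrix` (the notebook's exact formatting may differ):

```python
from sklearn.metrics import confusion_matrix

def print_confusion_matrix(y_true, y_pred):
    """Print the sample count and the 2x2 confusion matrix for a binary task."""
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    print(f"Total samples = {len(y_true)}")
    print(f"TN = {tn}   FP = {fp}")
    print(f"FN = {fn}   TP = {tp}")
    return tn, fp, fn, tp
```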
print("BASIC UNET MODEL : CLASSIFICATION REPORT\n")
print(classification_report(y_true, y_pred))
BASIC UNET MODEL : CLASSIFICATION REPORT
              precision    recall  f1-score   support

           0       1.00      0.00      0.00      2068
           1       0.23      1.00      0.37       601

    accuracy                           0.23      2669
   macro avg       0.61      0.50      0.18      2669
weighted avg       0.83      0.23      0.08      2669
# Pickle dump the report dictionary
report = classification_report(y_true, y_pred, output_dict=True)
filename = basic_unet_i64_b64_e6_creport
with open(filename, "wb") as filehandle:
    pickle.dump(report, filehandle)
# load the best weights and save using model.save to make sure the model can be retrieved later
best_weights = basic_unet_i64_b64_e6_path
basic_unet_i64_b64_e6_model.load_weights(best_weights)
best_model_path = basic_unet_i64_b64_e6_model_path
basic_unet_i64_b64_e6_model.save(best_model_path)
plt.figure(figsize=(20,10))
# Note: y_pred here holds thresholded 0/1 labels rather than probabilities,
# so the ROC curve collapses to a single operating point and the AUC stays near 0.5
fpri, tpri, _ = roc_curve(y_true, y_pred[:len(y_true)])
area_under_curvei = auc(fpri, tpri)
print('The area under the curve is:', area_under_curvei)
# Plot the ROC curve against the chance diagonal
plt.plot(fpri, tpri, 'b.-')
plt.xlabel('false positive rate')
plt.ylabel('true positive rate')
plt.plot([0, 1], [0, 1], linestyle='--', color='black')
The area under the curve is: 0.5002417794970986
[<matplotlib.lines.Line2D at 0x7f2cefca7c50>]
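The AUC of ~0.50 above is exactly what `roc_curve` produces when it is fed hard 0/1 predictions: thresholding discards the ranking information the ROC needs. A small self-contained illustration with synthetic labels and scores (not the notebook's data):

```python
import numpy as np
from sklearn.metrics import auc, roc_curve

y_true = np.array([0, 0, 0, 1, 1, 1])
scores = np.array([0.2, 0.6, 0.3, 0.8, 0.4, 0.9])   # continuous sigmoid outputs
hard = (scores >= 0.5).astype(int)                  # thresholded 0/1 predictions

fpr_s, tpr_s, _ = roc_curve(y_true, scores)
fpr_h, tpr_h, _ = roc_curve(y_true, hard)
auc_scores = auc(fpr_s, tpr_s)   # ranking information preserved (8/9 here)
auc_hard = auc(fpr_h, tpr_h)     # most ranking information lost (2/3 here)
```

Passing the raw probabilities from `model.predict` (as the VGG19 section below does) keeps the full curve.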
from keras.applications.vgg19 import VGG19
pre_trained_model = VGG19(input_shape = (256,256,3), include_top = False, weights = 'imagenet')
for layer in pre_trained_model.layers:
    layer.trainable = False
# pre_trained_model.summary()
last_layer = pre_trained_model.get_layer('block5_pool')
print('last layer output shape: ', last_layer.output_shape)
last_output = last_layer.output
from tensorflow.keras.layers import Flatten, Dense, Dropout, BatchNormalization, LeakyReLU, GaussianDropout
model = Flatten()(last_output)
model = Dense(1024)(model)
model = LeakyReLU(0.1)(model)
model = Dropout(0.25)(model)
model = BatchNormalization()(model)
model = Dense(1024)(model)
model = LeakyReLU(0.1)(model)
model = Dropout(0.25)(model)
model = BatchNormalization()(model)
model = Dense(1, activation='sigmoid')(model)
from tensorflow.keras.models import Model
vgg19model = Model(pre_trained_model.input, model)
vgg19model.compile(optimizer='adam',
                   loss='binary_crossentropy',
                   metrics=['accuracy'])
from tensorflow.keras.callbacks import EarlyStopping,ReduceLROnPlateau
early = EarlyStopping(monitor='accuracy', patience=3, mode='auto')  # defined but not passed to fit() below
reduce_lr = ReduceLROnPlateau(monitor='accuracy', factor=0.5, patience=2, verbose=1, cooldown=0, mode='auto', min_delta=0.0001, min_lr=0)
class_weight={0:1,1:3.3}
vgg19model.fit(train,epochs=20,callbacks=[reduce_lr],steps_per_epoch=100,validation_data=test,class_weight=class_weight)
Epoch 1/20
100/100 [==============================] - 252s 2s/step - loss: 1.1012 - accuracy: 0.6833 - val_loss: 0.4908 - val_accuracy: 0.7646 - lr: 0.0010
Epoch 2/20
100/100 [==============================] - 236s 2s/step - loss: 0.8331 - accuracy: 0.7284 - val_loss: 0.5485 - val_accuracy: 0.7151 - lr: 0.0010
Epoch 3/20
100/100 [==============================] - 235s 2s/step - loss: 0.7715 - accuracy: 0.7467 - val_loss: 0.4876 - val_accuracy: 0.7594 - lr: 0.0010
Epoch 4/20
100/100 [==============================] - 231s 2s/step - loss: 0.7557 - accuracy: 0.7478 - val_loss: 0.4574 - val_accuracy: 0.7785 - lr: 0.0010
Epoch 5/20
100/100 [==============================] - 237s 2s/step - loss: 0.7574 - accuracy: 0.7412 - val_loss: 0.3643 - val_accuracy: 0.8475 - lr: 0.0010
Epoch 6/20
100/100 [==============================] - 240s 2s/step - loss: 0.7368 - accuracy: 0.7516 - val_loss: 0.4071 - val_accuracy: 0.8081 - lr: 0.0010
Epoch 7/20
100/100 [==============================] - 244s 2s/step - loss: 0.7307 - accuracy: 0.7524 - val_loss: 0.3779 - val_accuracy: 0.8385 - lr: 0.0010
Epoch 8/20
100/100 [==============================] - 249s 2s/step - loss: 0.7299 - accuracy: 0.7509 - val_loss: 0.3962 - val_accuracy: 0.8141 - lr: 0.0010
Epoch 9/20
100/100 [==============================] - 245s 2s/step - loss: 0.7145 - accuracy: 0.7614 - val_loss: 0.4658 - val_accuracy: 0.7504 - lr: 0.0010
Epoch 10/20
100/100 [==============================] - 244s 2s/step - loss: 0.7202 - accuracy: 0.7606 - val_loss: 0.4945 - val_accuracy: 0.7298 - lr: 0.0010
Epoch 11/20
100/100 [==============================] - 244s 2s/step - loss: 0.7192 - accuracy: 0.7654 - val_loss: 0.4164 - val_accuracy: 0.8066 - lr: 0.0010
Epoch 12/20
100/100 [==============================] - 243s 2s/step - loss: 0.7098 - accuracy: 0.7584 - val_loss: 0.3907 - val_accuracy: 0.8193 - lr: 0.0010
Epoch 13/20
100/100 [==============================] - 242s 2s/step - loss: 0.7012 - accuracy: 0.7693 - val_loss: 0.3847 - val_accuracy: 0.8235 - lr: 0.0010
Epoch 14/20
100/100 [==============================] - 245s 2s/step - loss: 0.7108 - accuracy: 0.7641 - val_loss: 0.3692 - val_accuracy: 0.8430 - lr: 0.0010
Epoch 15/20
100/100 [==============================] - 245s 2s/step - loss: 0.6777 - accuracy: 0.7743 - val_loss: 0.4006 - val_accuracy: 0.8178 - lr: 0.0010
Epoch 16/20
100/100 [==============================] - 243s 2s/step - loss: 0.7061 - accuracy: 0.7566 - val_loss: 0.4599 - val_accuracy: 0.7493 - lr: 0.0010
Epoch 17/20
100/100 [==============================] - ETA: 0s - loss: 0.6779 - accuracy: 0.7679
Epoch 00017: ReduceLROnPlateau reducing learning rate to 0.0005000000237487257.
100/100 [==============================] - 244s 2s/step - loss: 0.6779 - accuracy: 0.7679 - val_loss: 0.3890 - val_accuracy: 0.8310 - lr: 0.0010
Epoch 18/20
100/100 [==============================] - 245s 2s/step - loss: 0.6677 - accuracy: 0.7749 - val_loss: 0.4164 - val_accuracy: 0.8010 - lr: 5.0000e-04
Epoch 19/20
76/100 [=====================>........] - ETA: 47s - loss: 0.6630 - accuracy: 0.7839
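A hedged note on where a `class_weight` like `{0: 1, 1: 3.3}` can come from: weighting the positive (pneumonia) class by the inverse class frequency. Using the class counts reported for the validation split earlier (2068 negative, 601 positive):

```python
# Class counts from the validation split reported above
n_neg, n_pos = 2068, 601

# Inverse-class-frequency weight for the minority (positive) class
class_weight = {0: 1.0, 1: round(n_neg / n_pos, 1)}
# n_neg / n_pos ≈ 3.44 — close to the hand-tuned weight of 3.3 passed to fit()
```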
Save the Model
vgg19model.save('/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/vgg19model.h5')
Plotting Accuracy and Validation Accuracy
val_acc = np.asarray(vgg19model.history.history['val_accuracy']) * 100
acc = np.asarray(vgg19model.history.history['accuracy']) * 100
acc = pd.DataFrame({'val_acc': val_acc, 'acc': acc})
acc.plot(figsize=(20,10), yticks=range(50,100,5))  # DataFrame.plot creates its own figure
<matplotlib.axes._subplots.AxesSubplot at 0x7fba48603350>
Plotting Loss and Validation Loss
loss=vgg19model.history.history['loss']
val_loss=vgg19model.history.history['val_loss']
loss=pd.DataFrame({'val_loss':val_loss,'loss':loss})
loss.plot(figsize=(20,10))
<matplotlib.axes._subplots.AxesSubplot at 0x7fba48695e50>
Model testing
y = []
test.reset()
for i in tqdm(range(4)):
    _, target = test.__getitem__(i)
    for j in target:
        y.append(j)
100%|██████████| 4/4 [00:01<00:00, 2.37it/s]
test.reset()
y_pred=vgg19model.predict(test)
pred = []
for i in y_pred:
    if i[0] >= 0.5:
        pred.append(1)
    else:
        pred.append(0)
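The element-wise loop above can be replaced by a single vectorized comparison; a sketch with illustrative sigmoid outputs of shape `(n_samples, 1)`, as `vgg19model.predict(test)` returns:

```python
import numpy as np

# Illustrative sigmoid outputs, shape (n_samples, 1)
y_pred = np.array([[0.73], [0.12], [0.55], [0.49]])

# Threshold every score at 0.5 in one step
pred = (y_pred[:, 0] >= 0.5).astype(int).tolist()
assert pred == [1, 0, 1, 0]
```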
from sklearn.metrics import roc_curve,auc,precision_recall_curve,classification_report
print(classification_report(y,pred[:len(y)]))
              precision    recall  f1-score   support

         0.0       0.94      0.89      0.91       101
         1.0       0.66      0.78      0.71        27

    accuracy                           0.87       128
   macro avg       0.80      0.83      0.81       128
weighted avg       0.88      0.87      0.87       128
AUC Curve
plt.figure(figsize=(20,10))
fprr,tprr,_=roc_curve(y,y_pred[:len(y)])
area_under_curver=auc(fprr,tprr)
print('The area under the curve is:',area_under_curver)
# Plot area under curve
plt.plot(fprr,tprr,'b.-')
plt.xlabel('false positive rate')
plt.ylabel('true positive rate')
plt.plot(fprr,fprr,linestyle='--',color='black')
The area under the curve is: 0.9321598826549322
[<matplotlib.lines.Line2D at 0x7fba48381510>]
#!pip install tensorflow==1.15.0
#!pip install tensorflow-gpu
#!pip install tf-nightly
from keras.models import load_model
from keras.preprocessing import image
# dimensions of our images
img_width, img_height = 256,256
def load_image(img_path, show=False):
    img = image.load_img(img_path, target_size=(256, 256))
    img_tensor = image.img_to_array(img)             # (height, width, channels)
    img_tensor = np.expand_dims(img_tensor, axis=0)  # (1, height, width, channels): the model expects a batch dimension
    img_tensor /= 255.                               # imshow expects values in the range [0, 1]
    if show:
        plt.imshow(img_tensor[0])
        plt.axis('off')
        plt.show()
    return img_tensor
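The same preprocessing contract can be checked without Keras on a synthetic array: HWC pixels become an NHWC batch scaled to [0, 1].

```python
import numpy as np

# Synthetic all-white "image" standing in for a loaded chest radiograph
img_tensor = np.full((256, 256, 3), 255, dtype=np.float32)
img_tensor = np.expand_dims(img_tensor, axis=0)   # add the batch dimension
img_tensor /= 255.0                               # scale pixel values into [0, 1]

assert img_tensor.shape == (1, 256, 256, 3)
assert float(img_tensor.max()) == 1.0
```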
# load the model we saved
model = load_model('/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/vgg19model.h5')
# load_model already restores the compiled state; re-compiling here swaps the optimizer to rmsprop,
# which is harmless for predict() but would matter if training were resumed
model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
# image path
img_path = '/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/positive/000db696-cf54-4385-b10b-6b16fbb3f985.jpg' # positive
#img_path = '/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/negative/52836696-f108-4ada-b6f8-3de1221fb7ac.jpg' # negative
# load a single image
test_image = load_image(img_path)
# check prediction
prediction = model.predict(test_image)
print(prediction)
[[0.72756654]]
import keras
from tensorflow.keras import Model
from keras.applications.vgg16 import VGG16
from keras.applications.vgg16 import preprocess_input
from tensorflow.keras.preprocessing.image import ImageDataGenerator
vgg_model = VGG16(input_shape = (256,256,3),include_top = False,weights = 'imagenet')
output = vgg_model.layers[-1].output
output = keras.layers.Flatten()(output)
vgg_model = Model(vgg_model.input, output)
for layer in vgg_model.layers:
    layer.trainable = False
vgg_model.summary()
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/vgg16/vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5
58892288/58889256 [==============================] - 0s 0us/step
58900480/58889256 [==============================] - 0s 0us/step
Model: "model"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_1 (InputLayer) [(None, 256, 256, 3)] 0
block1_conv1 (Conv2D) (None, 256, 256, 64) 1792
block1_conv2 (Conv2D) (None, 256, 256, 64) 36928
block1_pool (MaxPooling2D) (None, 128, 128, 64) 0
block2_conv1 (Conv2D) (None, 128, 128, 128) 73856
block2_conv2 (Conv2D) (None, 128, 128, 128) 147584
block2_pool (MaxPooling2D) (None, 64, 64, 128) 0
block3_conv1 (Conv2D) (None, 64, 64, 256) 295168
block3_conv2 (Conv2D) (None, 64, 64, 256) 590080
block3_conv3 (Conv2D) (None, 64, 64, 256) 590080
block3_pool (MaxPooling2D) (None, 32, 32, 256) 0
block4_conv1 (Conv2D) (None, 32, 32, 512) 1180160
block4_conv2 (Conv2D) (None, 32, 32, 512) 2359808
block4_conv3 (Conv2D) (None, 32, 32, 512) 2359808
block4_pool (MaxPooling2D) (None, 16, 16, 512) 0
block5_conv1 (Conv2D) (None, 16, 16, 512) 2359808
block5_conv2 (Conv2D) (None, 16, 16, 512) 2359808
block5_conv3 (Conv2D) (None, 16, 16, 512) 2359808
block5_pool (MaxPooling2D) (None, 8, 8, 512) 0
flatten (Flatten) (None, 32768) 0
=================================================================
Total params: 14,714,688
Trainable params: 0
Non-trainable params: 14,714,688
_________________________________________________________________
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout, InputLayer
from keras.models import Sequential
from keras import optimizers
input_shape = (256,256,3)
vmodel = Sequential()
vmodel.add(vgg_model)
vmodel.add(Dense(512, activation='relu'))  # input size is inferred from the flattened VGG16 output; passing input_dim a shape tuple is invalid
vmodel.add(Dropout(0.3))
vmodel.add(Dense(512, activation='relu'))
vmodel.add(Dropout(0.3))
vmodel.add(Dense(1, activation='sigmoid'))
vmodel.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
vmodel.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
model (Functional) (None, 32768) 14714688
dense (Dense) (None, 512) 16777728
dropout (Dropout) (None, 512) 0
dense_1 (Dense) (None, 512) 262656
dropout_1 (Dropout) (None, 512) 0
dense_2 (Dense) (None, 1) 513
=================================================================
Total params: 31,755,585
Trainable params: 17,040,897
Non-trainable params: 14,714,688
_________________________________________________________________
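The trainable-parameter count in the summary can be verified by hand: a `Dense` layer has `inputs × units + units` parameters (the second term is the bias vector).

```python
# Dense layer parameters = inputs * units + units (bias)
flat = 8 * 8 * 512                 # VGG16 block5_pool output, flattened to 32768
dense = flat * 512 + 512           # dense   -> 16,777,728
dense_1 = 512 * 512 + 512          # dense_1 ->    262,656
dense_2 = 512 * 1 + 1              # dense_2 ->        513

trainable = dense + dense_1 + dense_2
assert trainable == 17_040_897     # matches "Trainable params" in the summary
```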
vmodel.fit(train,epochs=20,callbacks=[reduce_lr],steps_per_epoch=100,validation_data=test,class_weight=class_weight)
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
<ipython-input-4-6ef9d9b302c4> in <module>()
----> 1 vmodel.fit(train,epochs=20,callbacks=[reduce_lr],steps_per_epoch=100,validation_data=test,class_weight=class_weight)

NameError: name 'train' is not defined
Saving the Model
vmodel.save('/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/model_vgg16.h5')
Plotting accuracy and validation accuracy
vval_acc = np.asarray(vmodel.history.history['val_accuracy']) * 100
vacc = np.asarray(vmodel.history.history['accuracy']) * 100
vacc = pd.DataFrame({'val_acc': vval_acc, 'acc': vacc})
vacc.plot(figsize=(20,10), yticks=range(50,100,5))  # DataFrame.plot creates its own figure
<matplotlib.axes._subplots.AxesSubplot at 0x7fba28fba0d0>
Plotting Loss and Validation Loss
vloss=vmodel.history.history['loss']
vval_loss=vmodel.history.history['val_loss']
vloss=pd.DataFrame({'val_loss':vval_loss,'loss':vloss})
vloss.plot(figsize=(20,10))
<matplotlib.axes._subplots.AxesSubplot at 0x7fba28eff110>
Model Testing
y = []
test.reset()
for i in tqdm(range(4)):
    _, target = test.__getitem__(i)
    for j in target:
        y.append(j)
100%|██████████| 4/4 [00:01<00:00, 2.36it/s]
test.reset()
y_predv=vmodel.predict(test)
predv = []
for i in y_predv:
    if i[0] >= 0.5:
        predv.append(1)
    else:
        predv.append(0)
Classification Report and ROC Curve
from sklearn.metrics import roc_curve,auc,precision_recall_curve,classification_report
print(classification_report(y,predv[:len(y)]))
              precision    recall  f1-score   support

         0.0       0.96      0.83      0.89        96
         1.0       0.64      0.91      0.75        32

    accuracy                           0.85       128
   macro avg       0.80      0.87      0.82       128
weighted avg       0.88      0.85      0.86       128
plt.figure(figsize=(20,10))
fprv,tprv,_=roc_curve(y,y_predv[:len(y)])
area_under_curvev=auc(fprv,tprv)
print('The area under the curve is:',area_under_curvev)
# Plot area under curve
plt.plot(fprv,tprv,'b.-')
plt.xlabel('false positive rate')
plt.ylabel('true positive rate')
plt.plot(fprv,fprv,linestyle='--',color='black')
The area under the curve is: 0.9208984375
[<matplotlib.lines.Line2D at 0x7fbaec390e10>]
from tensorflow.keras.applications.resnet import ResNet50
import keras
resnet_model = ResNet50(input_shape = (256,256,3), include_top = False, weights = 'imagenet')
output = resnet_model.layers[-1].output
output = keras.layers.Flatten()(output)
resnet_model = Model(resnet_model.input, output)
for layer in resnet_model.layers:
    layer.trainable = False
resnet_model.summary()
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/resnet/resnet50_weights_tf_dim_ordering_tf_kernels_notop.h5
94773248/94765736 [==============================] - 1s 0us/step
94781440/94765736 [==============================] - 1s 0us/step
Model: "model_2"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_3 (InputLayer) [(None, 256, 256, 3) 0
__________________________________________________________________________________________________
conv1_pad (ZeroPadding2D) (None, 262, 262, 3) 0 input_3[0][0]
__________________________________________________________________________________________________
conv1_conv (Conv2D) (None, 128, 128, 64) 9472 conv1_pad[0][0]
__________________________________________________________________________________________________
conv1_bn (BatchNormalization) (None, 128, 128, 64) 256 conv1_conv[0][0]
__________________________________________________________________________________________________
conv1_relu (Activation) (None, 128, 128, 64) 0 conv1_bn[0][0]
__________________________________________________________________________________________________
pool1_pad (ZeroPadding2D) (None, 130, 130, 64) 0 conv1_relu[0][0]
__________________________________________________________________________________________________
pool1_pool (MaxPooling2D) (None, 64, 64, 64) 0 pool1_pad[0][0]
__________________________________________________________________________________________________
conv2_block1_1_conv (Conv2D) (None, 64, 64, 64) 4160 pool1_pool[0][0]
__________________________________________________________________________________________________
conv2_block1_1_bn (BatchNormali (None, 64, 64, 64) 256 conv2_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_1_relu (Activation (None, 64, 64, 64) 0 conv2_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block1_2_conv (Conv2D) (None, 64, 64, 64) 36928 conv2_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block1_2_bn (BatchNormali (None, 64, 64, 64) 256 conv2_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_2_relu (Activation (None, 64, 64, 64) 0 conv2_block1_2_bn[0][0]
__________________________________________________________________________________________________
conv2_block1_0_conv (Conv2D) (None, 64, 64, 256) 16640 pool1_pool[0][0]
__________________________________________________________________________________________________
conv2_block1_3_conv (Conv2D) (None, 64, 64, 256) 16640 conv2_block1_2_relu[0][0]
__________________________________________________________________________________________________
conv2_block1_0_bn (BatchNormali (None, 64, 64, 256) 1024 conv2_block1_0_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_3_bn (BatchNormali (None, 64, 64, 256) 1024 conv2_block1_3_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_add (Add) (None, 64, 64, 256) 0 conv2_block1_0_bn[0][0]
conv2_block1_3_bn[0][0]
__________________________________________________________________________________________________
conv2_block1_out (Activation) (None, 64, 64, 256) 0 conv2_block1_add[0][0]
__________________________________________________________________________________________________
conv2_block2_1_conv (Conv2D) (None, 64, 64, 64) 16448 conv2_block1_out[0][0]
__________________________________________________________________________________________________
conv2_block2_1_bn (BatchNormali (None, 64, 64, 64) 256 conv2_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block2_1_relu (Activation (None, 64, 64, 64) 0 conv2_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block2_2_conv (Conv2D) (None, 64, 64, 64) 36928 conv2_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block2_2_bn (BatchNormali (None, 64, 64, 64) 256 conv2_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block2_2_relu (Activation (None, 64, 64, 64) 0 conv2_block2_2_bn[0][0]
__________________________________________________________________________________________________
conv2_block2_3_conv (Conv2D) (None, 64, 64, 256) 16640 conv2_block2_2_relu[0][0]
__________________________________________________________________________________________________
conv2_block2_3_bn (BatchNormali (None, 64, 64, 256) 1024 conv2_block2_3_conv[0][0]
__________________________________________________________________________________________________
conv2_block2_add (Add) (None, 64, 64, 256) 0 conv2_block1_out[0][0]
conv2_block2_3_bn[0][0]
__________________________________________________________________________________________________
conv2_block2_out (Activation) (None, 64, 64, 256) 0 conv2_block2_add[0][0]
__________________________________________________________________________________________________
conv2_block3_1_conv (Conv2D) (None, 64, 64, 64) 16448 conv2_block2_out[0][0]
__________________________________________________________________________________________________
conv2_block3_1_bn (BatchNormali (None, 64, 64, 64) 256 conv2_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block3_1_relu (Activation (None, 64, 64, 64) 0 conv2_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block3_2_conv (Conv2D) (None, 64, 64, 64) 36928 conv2_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block3_2_bn (BatchNormali (None, 64, 64, 64) 256 conv2_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block3_2_relu (Activation (None, 64, 64, 64) 0 conv2_block3_2_bn[0][0]
__________________________________________________________________________________________________
conv2_block3_3_conv (Conv2D) (None, 64, 64, 256) 16640 conv2_block3_2_relu[0][0]
__________________________________________________________________________________________________
conv2_block3_3_bn (BatchNormali (None, 64, 64, 256) 1024 conv2_block3_3_conv[0][0]
__________________________________________________________________________________________________
conv2_block3_add (Add) (None, 64, 64, 256) 0 conv2_block2_out[0][0]
conv2_block3_3_bn[0][0]
__________________________________________________________________________________________________
conv2_block3_out (Activation) (None, 64, 64, 256) 0 conv2_block3_add[0][0]
__________________________________________________________________________________________________
conv3_block1_1_conv (Conv2D) (None, 32, 32, 128) 32896 conv2_block3_out[0][0]
__________________________________________________________________________________________________
conv3_block1_1_bn (BatchNormali (None, 32, 32, 128) 512 conv3_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block1_1_relu (Activation (None, 32, 32, 128) 0 conv3_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block1_2_conv (Conv2D) (None, 32, 32, 128) 147584 conv3_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block1_2_bn (BatchNormali (None, 32, 32, 128) 512 conv3_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block1_2_relu (Activation (None, 32, 32, 128) 0 conv3_block1_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block1_0_conv (Conv2D) (None, 32, 32, 512) 131584 conv2_block3_out[0][0]
__________________________________________________________________________________________________
conv3_block1_3_conv (Conv2D) (None, 32, 32, 512) 66048 conv3_block1_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block1_0_bn (BatchNormali (None, 32, 32, 512) 2048 conv3_block1_0_conv[0][0]
__________________________________________________________________________________________________
conv3_block1_3_bn (BatchNormali (None, 32, 32, 512) 2048 conv3_block1_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block1_add (Add) (None, 32, 32, 512) 0 conv3_block1_0_bn[0][0]
conv3_block1_3_bn[0][0]
__________________________________________________________________________________________________
conv3_block1_out (Activation) (None, 32, 32, 512) 0 conv3_block1_add[0][0]
__________________________________________________________________________________________________
conv3_block2_1_conv (Conv2D) (None, 32, 32, 128) 65664 conv3_block1_out[0][0]
__________________________________________________________________________________________________
conv3_block2_1_bn (BatchNormali (None, 32, 32, 128) 512 conv3_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block2_1_relu (Activation (None, 32, 32, 128) 0 conv3_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block2_2_conv (Conv2D) (None, 32, 32, 128) 147584 conv3_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block2_2_bn (BatchNormali (None, 32, 32, 128) 512 conv3_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block2_2_relu (Activation (None, 32, 32, 128) 0 conv3_block2_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block2_3_conv (Conv2D) (None, 32, 32, 512) 66048 conv3_block2_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block2_3_bn (BatchNormali (None, 32, 32, 512) 2048 conv3_block2_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block2_add (Add) (None, 32, 32, 512) 0 conv3_block1_out[0][0]
conv3_block2_3_bn[0][0]
__________________________________________________________________________________________________
conv3_block2_out (Activation) (None, 32, 32, 512) 0 conv3_block2_add[0][0]
__________________________________________________________________________________________________
conv3_block3_1_conv (Conv2D) (None, 32, 32, 128) 65664 conv3_block2_out[0][0]
__________________________________________________________________________________________________
conv3_block3_1_bn (BatchNormali (None, 32, 32, 128) 512 conv3_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block3_1_relu (Activation (None, 32, 32, 128) 0 conv3_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block3_2_conv (Conv2D) (None, 32, 32, 128) 147584 conv3_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block3_2_bn (BatchNormali (None, 32, 32, 128) 512 conv3_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block3_2_relu (Activation (None, 32, 32, 128) 0 conv3_block3_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block3_3_conv (Conv2D) (None, 32, 32, 512) 66048 conv3_block3_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block3_3_bn (BatchNormali (None, 32, 32, 512) 2048 conv3_block3_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block3_add (Add) (None, 32, 32, 512) 0 conv3_block2_out[0][0]
conv3_block3_3_bn[0][0]
__________________________________________________________________________________________________
conv3_block3_out (Activation) (None, 32, 32, 512) 0 conv3_block3_add[0][0]
__________________________________________________________________________________________________
conv3_block4_1_conv (Conv2D) (None, 32, 32, 128) 65664 conv3_block3_out[0][0]
__________________________________________________________________________________________________
conv3_block4_1_bn (BatchNormali (None, 32, 32, 128) 512 conv3_block4_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block4_1_relu (Activation (None, 32, 32, 128) 0 conv3_block4_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block4_2_conv (Conv2D) (None, 32, 32, 128) 147584 conv3_block4_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block4_2_bn (BatchNormali (None, 32, 32, 128) 512 conv3_block4_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block4_2_relu (Activation (None, 32, 32, 128) 0 conv3_block4_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block4_3_conv (Conv2D) (None, 32, 32, 512) 66048 conv3_block4_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block4_3_bn (BatchNormali (None, 32, 32, 512) 2048 conv3_block4_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block4_add (Add) (None, 32, 32, 512) 0 conv3_block3_out[0][0]
conv3_block4_3_bn[0][0]
__________________________________________________________________________________________________
conv3_block4_out (Activation) (None, 32, 32, 512) 0 conv3_block4_add[0][0]
__________________________________________________________________________________________________
conv4_block1_1_conv (Conv2D) (None, 16, 16, 256) 131328 conv3_block4_out[0][0]
__________________________________________________________________________________________________
conv4_block1_1_bn (BatchNormali (None, 16, 16, 256) 1024 conv4_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block1_1_relu (Activation (None, 16, 16, 256) 0 conv4_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block1_2_conv (Conv2D) (None, 16, 16, 256) 590080 conv4_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block1_2_bn (BatchNormali (None, 16, 16, 256) 1024 conv4_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block1_2_relu (Activation (None, 16, 16, 256) 0 conv4_block1_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block1_0_conv (Conv2D) (None, 16, 16, 1024) 525312 conv3_block4_out[0][0]
__________________________________________________________________________________________________
conv4_block1_3_conv (Conv2D) (None, 16, 16, 1024) 263168 conv4_block1_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block1_0_bn (BatchNormali (None, 16, 16, 1024) 4096 conv4_block1_0_conv[0][0]
__________________________________________________________________________________________________
conv4_block1_3_bn (BatchNormali (None, 16, 16, 1024) 4096 conv4_block1_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block1_add (Add) (None, 16, 16, 1024) 0 conv4_block1_0_bn[0][0]
conv4_block1_3_bn[0][0]
__________________________________________________________________________________________________
conv4_block1_out (Activation) (None, 16, 16, 1024) 0 conv4_block1_add[0][0]
__________________________________________________________________________________________________
conv4_block2_1_conv (Conv2D) (None, 16, 16, 256) 262400 conv4_block1_out[0][0]
__________________________________________________________________________________________________
conv4_block2_1_bn (BatchNormali (None, 16, 16, 256) 1024 conv4_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block2_1_relu (Activation (None, 16, 16, 256) 0 conv4_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block2_2_conv (Conv2D) (None, 16, 16, 256) 590080 conv4_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block2_2_bn (BatchNormali (None, 16, 16, 256) 1024 conv4_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block2_2_relu (Activation (None, 16, 16, 256) 0 conv4_block2_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block2_3_conv (Conv2D) (None, 16, 16, 1024) 263168 conv4_block2_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block2_3_bn (BatchNormali (None, 16, 16, 1024) 4096 conv4_block2_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block2_add (Add) (None, 16, 16, 1024) 0 conv4_block1_out[0][0]
conv4_block2_3_bn[0][0]
__________________________________________________________________________________________________
conv4_block2_out (Activation) (None, 16, 16, 1024) 0 conv4_block2_add[0][0]
__________________________________________________________________________________________________
conv4_block3_1_conv (Conv2D) (None, 16, 16, 256) 262400 conv4_block2_out[0][0]
__________________________________________________________________________________________________
conv4_block3_1_bn (BatchNormali (None, 16, 16, 256) 1024 conv4_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block3_1_relu (Activation (None, 16, 16, 256) 0 conv4_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block3_2_conv (Conv2D) (None, 16, 16, 256) 590080 conv4_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block3_2_bn (BatchNormali (None, 16, 16, 256) 1024 conv4_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block3_2_relu (Activation (None, 16, 16, 256) 0 conv4_block3_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block3_3_conv (Conv2D) (None, 16, 16, 1024) 263168 conv4_block3_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block3_3_bn (BatchNormali (None, 16, 16, 1024) 4096 conv4_block3_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block3_add (Add) (None, 16, 16, 1024) 0 conv4_block2_out[0][0]
conv4_block3_3_bn[0][0]
__________________________________________________________________________________________________
conv4_block3_out (Activation) (None, 16, 16, 1024) 0 conv4_block3_add[0][0]
__________________________________________________________________________________________________
conv4_block4_1_conv (Conv2D) (None, 16, 16, 256) 262400 conv4_block3_out[0][0]
__________________________________________________________________________________________________
conv4_block4_1_bn (BatchNormali (None, 16, 16, 256) 1024 conv4_block4_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block4_1_relu (Activation (None, 16, 16, 256) 0 conv4_block4_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block4_2_conv (Conv2D) (None, 16, 16, 256) 590080 conv4_block4_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block4_2_bn (BatchNormali (None, 16, 16, 256) 1024 conv4_block4_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block4_2_relu (Activation (None, 16, 16, 256) 0 conv4_block4_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block4_3_conv (Conv2D) (None, 16, 16, 1024) 263168 conv4_block4_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block4_3_bn (BatchNormali (None, 16, 16, 1024) 4096 conv4_block4_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block4_add (Add) (None, 16, 16, 1024) 0 conv4_block3_out[0][0]
conv4_block4_3_bn[0][0]
__________________________________________________________________________________________________
conv4_block4_out (Activation) (None, 16, 16, 1024) 0 conv4_block4_add[0][0]
__________________________________________________________________________________________________
conv4_block5_1_conv (Conv2D) (None, 16, 16, 256) 262400 conv4_block4_out[0][0]
__________________________________________________________________________________________________
conv4_block5_1_bn (BatchNormali (None, 16, 16, 256) 1024 conv4_block5_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block5_1_relu (Activation (None, 16, 16, 256) 0 conv4_block5_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block5_2_conv (Conv2D) (None, 16, 16, 256) 590080 conv4_block5_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block5_2_bn (BatchNormali (None, 16, 16, 256) 1024 conv4_block5_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block5_2_relu (Activation (None, 16, 16, 256) 0 conv4_block5_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block5_3_conv (Conv2D) (None, 16, 16, 1024) 263168 conv4_block5_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block5_3_bn (BatchNormali (None, 16, 16, 1024) 4096 conv4_block5_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block5_add (Add) (None, 16, 16, 1024) 0 conv4_block4_out[0][0]
conv4_block5_3_bn[0][0]
__________________________________________________________________________________________________
conv4_block5_out (Activation) (None, 16, 16, 1024) 0 conv4_block5_add[0][0]
__________________________________________________________________________________________________
conv4_block6_1_conv (Conv2D) (None, 16, 16, 256) 262400 conv4_block5_out[0][0]
__________________________________________________________________________________________________
conv4_block6_1_bn (BatchNormali (None, 16, 16, 256) 1024 conv4_block6_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block6_1_relu (Activation (None, 16, 16, 256) 0 conv4_block6_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block6_2_conv (Conv2D) (None, 16, 16, 256) 590080 conv4_block6_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block6_2_bn (BatchNormali (None, 16, 16, 256) 1024 conv4_block6_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block6_2_relu (Activation (None, 16, 16, 256) 0 conv4_block6_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block6_3_conv (Conv2D) (None, 16, 16, 1024) 263168 conv4_block6_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block6_3_bn (BatchNormali (None, 16, 16, 1024) 4096 conv4_block6_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block6_add (Add) (None, 16, 16, 1024) 0 conv4_block5_out[0][0]
conv4_block6_3_bn[0][0]
__________________________________________________________________________________________________
conv4_block6_out (Activation) (None, 16, 16, 1024) 0 conv4_block6_add[0][0]
__________________________________________________________________________________________________
conv5_block1_1_conv (Conv2D) (None, 8, 8, 512) 524800 conv4_block6_out[0][0]
__________________________________________________________________________________________________
conv5_block1_1_bn (BatchNormali (None, 8, 8, 512) 2048 conv5_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block1_1_relu (Activation (None, 8, 8, 512) 0 conv5_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block1_2_conv (Conv2D) (None, 8, 8, 512) 2359808 conv5_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block1_2_bn (BatchNormali (None, 8, 8, 512) 2048 conv5_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block1_2_relu (Activation (None, 8, 8, 512) 0 conv5_block1_2_bn[0][0]
__________________________________________________________________________________________________
conv5_block1_0_conv (Conv2D) (None, 8, 8, 2048) 2099200 conv4_block6_out[0][0]
__________________________________________________________________________________________________
conv5_block1_3_conv (Conv2D) (None, 8, 8, 2048) 1050624 conv5_block1_2_relu[0][0]
__________________________________________________________________________________________________
conv5_block1_0_bn (BatchNormali (None, 8, 8, 2048) 8192 conv5_block1_0_conv[0][0]
__________________________________________________________________________________________________
conv5_block1_3_bn (BatchNormali (None, 8, 8, 2048) 8192 conv5_block1_3_conv[0][0]
__________________________________________________________________________________________________
conv5_block1_add (Add) (None, 8, 8, 2048) 0 conv5_block1_0_bn[0][0]
conv5_block1_3_bn[0][0]
__________________________________________________________________________________________________
conv5_block1_out (Activation) (None, 8, 8, 2048) 0 conv5_block1_add[0][0]
__________________________________________________________________________________________________
conv5_block2_1_conv (Conv2D) (None, 8, 8, 512) 1049088 conv5_block1_out[0][0]
__________________________________________________________________________________________________
conv5_block2_1_bn (BatchNormali (None, 8, 8, 512) 2048 conv5_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block2_1_relu (Activation (None, 8, 8, 512) 0 conv5_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block2_2_conv (Conv2D) (None, 8, 8, 512) 2359808 conv5_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block2_2_bn (BatchNormali (None, 8, 8, 512) 2048 conv5_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block2_2_relu (Activation (None, 8, 8, 512) 0 conv5_block2_2_bn[0][0]
__________________________________________________________________________________________________
conv5_block2_3_conv (Conv2D) (None, 8, 8, 2048) 1050624 conv5_block2_2_relu[0][0]
__________________________________________________________________________________________________
conv5_block2_3_bn (BatchNormali (None, 8, 8, 2048) 8192 conv5_block2_3_conv[0][0]
__________________________________________________________________________________________________
conv5_block2_add (Add) (None, 8, 8, 2048) 0 conv5_block1_out[0][0]
conv5_block2_3_bn[0][0]
__________________________________________________________________________________________________
conv5_block2_out (Activation) (None, 8, 8, 2048) 0 conv5_block2_add[0][0]
__________________________________________________________________________________________________
conv5_block3_1_conv (Conv2D) (None, 8, 8, 512) 1049088 conv5_block2_out[0][0]
__________________________________________________________________________________________________
conv5_block3_1_bn (BatchNormali (None, 8, 8, 512) 2048 conv5_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block3_1_relu (Activation (None, 8, 8, 512) 0 conv5_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block3_2_conv (Conv2D) (None, 8, 8, 512) 2359808 conv5_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block3_2_bn (BatchNormali (None, 8, 8, 512) 2048 conv5_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block3_2_relu (Activation (None, 8, 8, 512) 0 conv5_block3_2_bn[0][0]
__________________________________________________________________________________________________
conv5_block3_3_conv (Conv2D) (None, 8, 8, 2048) 1050624 conv5_block3_2_relu[0][0]
__________________________________________________________________________________________________
conv5_block3_3_bn (BatchNormali (None, 8, 8, 2048) 8192 conv5_block3_3_conv[0][0]
__________________________________________________________________________________________________
conv5_block3_add (Add) (None, 8, 8, 2048) 0 conv5_block2_out[0][0]
conv5_block3_3_bn[0][0]
__________________________________________________________________________________________________
conv5_block3_out (Activation) (None, 8, 8, 2048) 0 conv5_block3_add[0][0]
__________________________________________________________________________________________________
flatten_2 (Flatten) (None, 131072) 0 conv5_block3_out[0][0]
==================================================================================================
Total params: 23,587,712
Trainable params: 0
Non-trainable params: 23,587,712
__________________________________________________________________________________________________
Model Compilation
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout, InputLayer
from keras.models import Sequential
from keras import optimizers

rmodel = Sequential()
rmodel.add(resnet_model)  # frozen ResNet50 base; its flattened output feeds the head
rmodel.add(Dense(512, activation='relu'))
rmodel.add(Dropout(0.3))
rmodel.add(Dense(512, activation='relu'))
rmodel.add(Dropout(0.3))
rmodel.add(Dense(1, activation='sigmoid'))  # binary output: pneumonia vs. normal
rmodel.compile(optimizer='adam',
               loss='binary_crossentropy',
               metrics=['accuracy'])
rmodel.summary()
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
model_2 (Functional)         (None, 131072)            23587712
_________________________________________________________________
dense_6 (Dense)              (None, 512)               67109376
_________________________________________________________________
dropout_4 (Dropout)          (None, 512)               0
_________________________________________________________________
dense_7 (Dense)              (None, 512)               262656
_________________________________________________________________
dropout_5 (Dropout)          (None, 512)               0
_________________________________________________________________
dense_8 (Dense)              (None, 1)                 513
=================================================================
Total params: 90,960,257
Trainable params: 67,372,545
Non-trainable params: 23,587,712
_________________________________________________________________
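The Dense parameter counts in the summary above follow directly from `inputs × units + units` (a weight per input connection plus one bias per unit); a quick arithmetic check against the printed figures:

```python
# Parameter counts for the classification head, derived by hand.
flatten_dim = 131072          # ResNet50 conv output flattened: 8 * 8 * 2048
units = 512

dense_6_params = flatten_dim * units + units   # 67,109,376
dense_7_params = units * units + units         # 262,656
dense_8_params = units * 1 + 1                 # 513
```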
Model Training
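The `fit` call below relies on `reduce_lr` and `class_weight`, both defined earlier in the notebook. As a reference, a minimal sketch of how balanced class weights are commonly derived from label frequencies (the labels here are illustrative, not the actual dataset's):

```python
import numpy as np

def balanced_class_weights(labels):
    """Inverse-frequency weights in the dict form Keras fit(class_weight=...) expects."""
    labels = np.asarray(labels)
    classes, counts = np.unique(labels, return_counts=True)
    total = labels.size
    # 'balanced' heuristic: weight_c = total / (n_classes * count_c)
    return {int(c): total / (len(classes) * int(n)) for c, n in zip(classes, counts)}

# Illustrative labels: three normal (0) cases for every pneumonia (1) case,
# so the minority class gets the larger weight.
weights = balanced_class_weights([0, 0, 0, 1])  # -> {0: 0.666..., 1: 2.0}
```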
rmodel.fit(train,epochs=20,callbacks=[reduce_lr],steps_per_epoch=100,validation_data=test,class_weight=class_weight)
Epoch 1/20
100/100 [==============================] - 218s 2s/step - loss: 3.7895 - accuracy: 0.5874 - val_loss: 0.5534 - val_accuracy: 0.7335
Epoch 2/20
100/100 [==============================] - 215s 2s/step - loss: 0.9328 - accuracy: 0.6418 - val_loss: 0.5404 - val_accuracy: 0.6964
Epoch 3/20
100/100 [==============================] - 214s 2s/step - loss: 0.9504 - accuracy: 0.5638 - val_loss: 0.5331 - val_accuracy: 0.7106
Epoch 4/20
100/100 [==============================] - 215s 2s/step - loss: 0.9541 - accuracy: 0.6431 - val_loss: 0.6721 - val_accuracy: 0.7526
Epoch 5/20
100/100 [==============================] - 216s 2s/step - loss: 1.0215 - accuracy: 0.7566 - val_loss: 0.6358 - val_accuracy: 0.7954
Epoch 6/20
100/100 [==============================] - 216s 2s/step - loss: 0.9964 - accuracy: 0.7538 - val_loss: 0.5614 - val_accuracy: 0.7155
Epoch 7/20
100/100 [==============================] - 215s 2s/step - loss: 0.9575 - accuracy: 0.7548 - val_loss: 0.5789 - val_accuracy: 0.8220
Epoch 00007: ReduceLROnPlateau reducing learning rate to 0.0005000000237487257.
Epoch 8/20
100/100 [==============================] - 214s 2s/step - loss: 0.9714 - accuracy: 0.7561 - val_loss: 0.5755 - val_accuracy: 0.8081
Epoch 9/20
100/100 [==============================] - 214s 2s/step - loss: 0.9601 - accuracy: 0.7513 - val_loss: 0.5884 - val_accuracy: 0.7747
Epoch 00009: ReduceLROnPlateau reducing learning rate to 0.0002500000118743628.
Epoch 10/20
100/100 [==============================] - 216s 2s/step - loss: 0.9229 - accuracy: 0.7502 - val_loss: 0.5737 - val_accuracy: 0.7650
Epoch 11/20
100/100 [==============================] - 217s 2s/step - loss: 0.9325 - accuracy: 0.7341 - val_loss: 0.5777 - val_accuracy: 0.7493
Epoch 00011: ReduceLROnPlateau reducing learning rate to 0.0001250000059371814.
Epoch 12/20
100/100 [==============================] - 214s 2s/step - loss: 0.9257 - accuracy: 0.7396 - val_loss: 0.5612 - val_accuracy: 0.7725
Epoch 13/20
100/100 [==============================] - 215s 2s/step - loss: 0.9063 - accuracy: 0.7462 - val_loss: 0.5457 - val_accuracy: 0.7916
Epoch 00013: ReduceLROnPlateau reducing learning rate to 6.25000029685907e-05.
Epoch 14/20
100/100 [==============================] - 214s 2s/step - loss: 0.9039 - accuracy: 0.7494 - val_loss: 0.5479 - val_accuracy: 0.7894
Epoch 15/20
100/100 [==============================] - 214s 2s/step - loss: 0.9062 - accuracy: 0.7363 - val_loss: 0.5521 - val_accuracy: 0.7732
Epoch 00015: ReduceLROnPlateau reducing learning rate to 3.125000148429535e-05.
Epoch 16/20
100/100 [==============================] - 214s 2s/step - loss: 0.8966 - accuracy: 0.7366 - val_loss: 0.5509 - val_accuracy: 0.7804
Epoch 17/20
100/100 [==============================] - 217s 2s/step - loss: 0.8995 - accuracy: 0.7393 - val_loss: 0.5502 - val_accuracy: 0.7789
Epoch 00017: ReduceLROnPlateau reducing learning rate to 1.5625000742147677e-05.
Epoch 18/20
100/100 [==============================] - 219s 2s/step - loss: 0.8955 - accuracy: 0.7408 - val_loss: 0.5392 - val_accuracy: 0.7939
Epoch 19/20
100/100 [==============================] - 219s 2s/step - loss: 0.8826 - accuracy: 0.7422 - val_loss: 0.5520 - val_accuracy: 0.7819
Epoch 00019: ReduceLROnPlateau reducing learning rate to 7.812500371073838e-06.
Epoch 20/20
100/100 [==============================] - 221s 2s/step - loss: 0.8902 - accuracy: 0.7390 - val_loss: 0.5426 - val_accuracy: 0.7852
<keras.callbacks.History at 0x7fba28afdd50>
Saving the Model
rmodel.save('/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/model_resnet50.h5')
/usr/local/lib/python3.7/dist-packages/keras/utils/generic_utils.py:497: CustomMaskWarning: Custom mask layers require a config and must override get_config. When loading, the custom mask layer must be passed to the custom_objects argument. category=CustomMaskWarning)
Plotting Training & Validation Accuracy
rval_acc = np.asarray(rmodel.history.history['val_accuracy']) * 100
racc = np.asarray(rmodel.history.history['accuracy']) * 100
racc = pd.DataFrame({'val_acc': rval_acc, 'acc': racc})
racc.plot(figsize=(20, 10), yticks=range(50, 100, 5))  # DataFrame.plot creates its own figure
<matplotlib.axes._subplots.AxesSubplot at 0x7fba24400190>
Plotting Training & Validation Loss
rloss=rmodel.history.history['loss']
rval_loss=rmodel.history.history['val_loss']
rloss=pd.DataFrame({'val_loss':rval_loss,'loss':rloss})
rloss.plot(figsize=(20,10))
<matplotlib.axes._subplots.AxesSubplot at 0x7fba28dd2190>
Model Testing
# Collect the ground-truth labels for the first four test batches
y = []
test.reset()
for i in tqdm(range(4)):
    _, target = test[i]
    for j in target:
        y.append(j)
100%|██████████| 4/4 [00:01<00:00, 2.28it/s]
# Predict probabilities and threshold at 0.5 to obtain class labels
test.reset()
y_predr = rmodel.predict(test)
predr = []
for i in y_predr:
    if i[0] >= 0.5:
        predr.append(1)
    else:
        predr.append(0)
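The element-wise thresholding loop above can be collapsed into a single vectorized comparison; a sketch with illustrative probabilities in place of the model's output:

```python
import numpy as np

# Illustrative sigmoid outputs, shaped (n, 1) like model.predict's result
y_pred = np.array([[0.1], [0.7], [0.5], [0.3]])

# One vectorized comparison replaces the if/else loop
pred = (y_pred[:, 0] >= 0.5).astype(int).tolist()  # -> [0, 1, 1, 0]
```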
Classification Report and ROC Curve
print(classification_report(y,predr[:len(y)]))
              precision    recall  f1-score   support

         0.0       0.98      0.80      0.88       105
         1.0       0.50      0.91      0.65        23

    accuracy                           0.82       128
   macro avg       0.74      0.86      0.76       128
weighted avg       0.89      0.82      0.84       128
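The per-class figures in the report come straight from confusion counts: precision is TP/(TP+FP) and recall is TP/(TP+FN) for each class. A small self-check on toy labels (not the actual test set):

```python
import numpy as np

def precision_recall(y_true, y_pred, positive=1):
    """Precision = TP/(TP+FP), recall = TP/(TP+FN) for the chosen positive class."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    tp = int(np.sum((y_pred == positive) & (y_true == positive)))
    fp = int(np.sum((y_pred == positive) & (y_true != positive)))
    fn = int(np.sum((y_pred != positive) & (y_true == positive)))
    return tp / (tp + fp), tp / (tp + fn)

# Toy case: 1 true positive, 2 false positives, 1 false negative
p, r = precision_recall([1, 1, 0, 0], [1, 0, 1, 1])  # -> (0.333..., 0.5)
```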
plt.figure(figsize=(20,10))
fprr,tprr,_=roc_curve(y,y_predr[:len(y)])
area_under_curver=auc(fprr,tprr)
print('The area under the curve is:',area_under_curver)
# Plot the ROC curve; the dashed diagonal is the chance-level baseline
plt.plot(fprr,tprr,'b.-')
plt.xlabel('false positive rate')
plt.ylabel('true positive rate')
plt.plot(fprr,fprr,linestyle='--',color='black')
The area under the curve is: 0.9055900621118012
[<matplotlib.lines.Line2D at 0x7fba24c80190>]
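The `auc` call integrates the ROC curve by the trapezoidal rule, so the reported area can be reproduced directly from the `(fpr, tpr)` points. A sketch with illustrative points (not the model's actual curve):

```python
import numpy as np

# Illustrative ROC points; a real curve has one point per threshold
fpr = np.array([0.0, 0.2, 1.0])
tpr = np.array([0.0, 0.9, 1.0])

# Trapezoidal rule: sum of width * average height over each segment
roc_auc = float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2.0))  # -> 0.85
```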
from tensorflow.keras.applications.inception_v3 import InceptionV3

inc_model = InceptionV3(input_shape=(256, 256, 3),
                        include_top=False,
                        weights='imagenet')
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/inception_v3/inception_v3_weights_tf_dim_ordering_tf_kernels_notop.h5 87916544/87910968 [==============================] - 4s 0us/step 87924736/87910968 [==============================] - 4s 0us/step
import keras
from keras.models import Model

# Take the convolutional output, flatten it, and freeze every InceptionV3 layer
output = inc_model.layers[-1].output
output = keras.layers.Flatten()(output)
inc_model = Model(inc_model.input, output)
for layer in inc_model.layers:
    layer.trainable = False
inc_model.summary()
Model: "model"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) [(None, 256, 256, 3 0 []
)]
conv2d (Conv2D) (None, 127, 127, 32 864 ['input_1[0][0]']
)
batch_normalization (BatchNorm (None, 127, 127, 32 96 ['conv2d[0][0]']
alization) )
activation (Activation) (None, 127, 127, 32 0 ['batch_normalization[0][0]']
)
conv2d_1 (Conv2D) (None, 125, 125, 32 9216 ['activation[0][0]']
)
batch_normalization_1 (BatchNo (None, 125, 125, 32 96 ['conv2d_1[0][0]']
rmalization) )
activation_1 (Activation) (None, 125, 125, 32 0 ['batch_normalization_1[0][0]']
)
conv2d_2 (Conv2D) (None, 125, 125, 64 18432 ['activation_1[0][0]']
)
batch_normalization_2 (BatchNo (None, 125, 125, 64 192 ['conv2d_2[0][0]']
rmalization) )
activation_2 (Activation) (None, 125, 125, 64 0 ['batch_normalization_2[0][0]']
)
max_pooling2d (MaxPooling2D) (None, 62, 62, 64) 0 ['activation_2[0][0]']
conv2d_3 (Conv2D) (None, 62, 62, 80) 5120 ['max_pooling2d[0][0]']
batch_normalization_3 (BatchNo (None, 62, 62, 80) 240 ['conv2d_3[0][0]']
rmalization)
activation_3 (Activation) (None, 62, 62, 80) 0 ['batch_normalization_3[0][0]']
conv2d_4 (Conv2D) (None, 60, 60, 192) 138240 ['activation_3[0][0]']
batch_normalization_4 (BatchNo (None, 60, 60, 192) 576 ['conv2d_4[0][0]']
rmalization)
activation_4 (Activation) (None, 60, 60, 192) 0 ['batch_normalization_4[0][0]']
max_pooling2d_1 (MaxPooling2D) (None, 29, 29, 192) 0 ['activation_4[0][0]']
conv2d_8 (Conv2D) (None, 29, 29, 64) 12288 ['max_pooling2d_1[0][0]']
batch_normalization_8 (BatchNo (None, 29, 29, 64) 192 ['conv2d_8[0][0]']
rmalization)
activation_8 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_8[0][0]']
conv2d_6 (Conv2D) (None, 29, 29, 48) 9216 ['max_pooling2d_1[0][0]']
conv2d_9 (Conv2D) (None, 29, 29, 96) 55296 ['activation_8[0][0]']
batch_normalization_6 (BatchNo (None, 29, 29, 48) 144 ['conv2d_6[0][0]']
rmalization)
batch_normalization_9 (BatchNo (None, 29, 29, 96) 288 ['conv2d_9[0][0]']
rmalization)
activation_6 (Activation) (None, 29, 29, 48) 0 ['batch_normalization_6[0][0]']
activation_9 (Activation) (None, 29, 29, 96) 0 ['batch_normalization_9[0][0]']
average_pooling2d (AveragePool (None, 29, 29, 192) 0 ['max_pooling2d_1[0][0]']
ing2D)
conv2d_5 (Conv2D) (None, 29, 29, 64) 12288 ['max_pooling2d_1[0][0]']
conv2d_7 (Conv2D) (None, 29, 29, 64) 76800 ['activation_6[0][0]']
conv2d_10 (Conv2D) (None, 29, 29, 96) 82944 ['activation_9[0][0]']
conv2d_11 (Conv2D) (None, 29, 29, 32) 6144 ['average_pooling2d[0][0]']
batch_normalization_5 (BatchNo (None, 29, 29, 64) 192 ['conv2d_5[0][0]']
rmalization)
batch_normalization_7 (BatchNo (None, 29, 29, 64) 192 ['conv2d_7[0][0]']
rmalization)
batch_normalization_10 (BatchN (None, 29, 29, 96) 288 ['conv2d_10[0][0]']
ormalization)
batch_normalization_11 (BatchN (None, 29, 29, 32) 96 ['conv2d_11[0][0]']
ormalization)
activation_5 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_5[0][0]']
activation_7 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_7[0][0]']
activation_10 (Activation) (None, 29, 29, 96) 0 ['batch_normalization_10[0][0]']
activation_11 (Activation) (None, 29, 29, 32) 0 ['batch_normalization_11[0][0]']
mixed0 (Concatenate) (None, 29, 29, 256) 0 ['activation_5[0][0]',
'activation_7[0][0]',
'activation_10[0][0]',
'activation_11[0][0]']
conv2d_15 (Conv2D) (None, 29, 29, 64) 16384 ['mixed0[0][0]']
batch_normalization_15 (BatchN (None, 29, 29, 64) 192 ['conv2d_15[0][0]']
ormalization)
activation_15 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_15[0][0]']
conv2d_13 (Conv2D) (None, 29, 29, 48) 12288 ['mixed0[0][0]']
conv2d_16 (Conv2D) (None, 29, 29, 96) 55296 ['activation_15[0][0]']
batch_normalization_13 (BatchN (None, 29, 29, 48) 144 ['conv2d_13[0][0]']
ormalization)
batch_normalization_16 (BatchN (None, 29, 29, 96) 288 ['conv2d_16[0][0]']
ormalization)
activation_13 (Activation) (None, 29, 29, 48) 0 ['batch_normalization_13[0][0]']
activation_16 (Activation) (None, 29, 29, 96) 0 ['batch_normalization_16[0][0]']
average_pooling2d_1 (AveragePo (None, 29, 29, 256) 0 ['mixed0[0][0]']
oling2D)
conv2d_12 (Conv2D) (None, 29, 29, 64) 16384 ['mixed0[0][0]']
conv2d_14 (Conv2D) (None, 29, 29, 64) 76800 ['activation_13[0][0]']
conv2d_17 (Conv2D) (None, 29, 29, 96) 82944 ['activation_16[0][0]']
conv2d_18 (Conv2D) (None, 29, 29, 64) 16384 ['average_pooling2d_1[0][0]']
batch_normalization_12 (BatchN (None, 29, 29, 64) 192 ['conv2d_12[0][0]']
ormalization)
batch_normalization_14 (BatchN (None, 29, 29, 64) 192 ['conv2d_14[0][0]']
ormalization)
batch_normalization_17 (BatchN (None, 29, 29, 96) 288 ['conv2d_17[0][0]']
ormalization)
batch_normalization_18 (BatchN (None, 29, 29, 64) 192 ['conv2d_18[0][0]']
ormalization)
activation_12 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_12[0][0]']
activation_14 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_14[0][0]']
activation_17 (Activation) (None, 29, 29, 96) 0 ['batch_normalization_17[0][0]']
activation_18 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_18[0][0]']
mixed1 (Concatenate) (None, 29, 29, 288) 0 ['activation_12[0][0]',
'activation_14[0][0]',
'activation_17[0][0]',
'activation_18[0][0]']
conv2d_22 (Conv2D) (None, 29, 29, 64) 18432 ['mixed1[0][0]']
batch_normalization_22 (BatchN (None, 29, 29, 64) 192 ['conv2d_22[0][0]']
ormalization)
activation_22 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_22[0][0]']
conv2d_20 (Conv2D) (None, 29, 29, 48) 13824 ['mixed1[0][0]']
conv2d_23 (Conv2D) (None, 29, 29, 96) 55296 ['activation_22[0][0]']
batch_normalization_20 (BatchN (None, 29, 29, 48) 144 ['conv2d_20[0][0]']
ormalization)
batch_normalization_23 (BatchN (None, 29, 29, 96) 288 ['conv2d_23[0][0]']
ormalization)
activation_20 (Activation) (None, 29, 29, 48) 0 ['batch_normalization_20[0][0]']
activation_23 (Activation) (None, 29, 29, 96) 0 ['batch_normalization_23[0][0]']
average_pooling2d_2 (AveragePo (None, 29, 29, 288) 0 ['mixed1[0][0]']
oling2D)
conv2d_19 (Conv2D) (None, 29, 29, 64) 18432 ['mixed1[0][0]']
… (InceptionV3 base summary truncated: the inception blocks mixed2 through mixed10 repeat the same pattern of parallel Conv2D → BatchNormalization → Activation branches merged by Concatenate layers, with occasional MaxPooling2D/AveragePooling2D branches; spatial size shrinks from 29×29 to 14×14 to 6×6 while channels grow to 2048) …
flatten (Flatten) (None, 73728) 0 ['mixed10[0][0]']
==================================================================================================
Total params: 21,802,784
Trainable params: 0
Non-trainable params: 21,802,784
__________________________________________________________________________________________________
Model Compilation
from keras.models import Sequential
from keras.layers import Dense, Dropout

# Stack the pre-trained InceptionV3 base with a small classification head.
# The Dense layers infer their input shape from the base model's output,
# so no input_dim argument is needed here.
in_model = Sequential()
in_model.add(inc_model)
in_model.add(Dense(512, activation='relu'))
in_model.add(Dropout(0.3))
in_model.add(Dense(512, activation='relu'))
in_model.add(Dropout(0.3))
in_model.add(Dense(1, activation='sigmoid'))  # binary: pneumonia vs. normal
in_model.compile(optimizer='adam',
                 loss='binary_crossentropy',
                 metrics=['accuracy'])
in_model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
model (Functional) (None, 73728) 21802784
dense (Dense) (None, 512) 37749248
dropout (Dropout) (None, 512) 0
dense_1 (Dense) (None, 512) 262656
dropout_1 (Dropout) (None, 512) 0
dense_2 (Dense) (None, 1) 513
=================================================================
Total params: 59,815,201
Trainable params: 38,012,417
Non-trainable params: 21,802,784
_________________________________________________________________
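The `binary_crossentropy` loss the model minimizes is, per sample, −[y·log p + (1−y)·log(1−p)] averaged over the batch. A quick NumPy check on made-up labels and probabilities (not the notebook's data) shows the computation:

```python
import numpy as np

y = np.array([1.0, 0.0, 1.0])   # hypothetical true labels
p = np.array([0.9, 0.2, 0.6])   # hypothetical sigmoid outputs

# Mean binary cross-entropy over the three samples
bce = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
```

Confident predictions that are correct (0.9 for class 1) contribute little loss; the uncertain 0.6 dominates the average.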
Model Training
in_model.fit(train,epochs=20,callbacks=[reduce_lr],steps_per_epoch=100,validation_data=test,class_weight=class_weight)
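The `reduce_lr` callback and `class_weight` dict passed to `fit` are defined earlier in the notebook. As a hedged sketch of how such class weights can be derived for an imbalanced label set (the counts below mirror the test-set support of 107 vs. 21 but are otherwise hypothetical), scikit-learn's `compute_class_weight` does the bookkeeping:

```python
import numpy as np
from sklearn.utils.class_weight import compute_class_weight

# Hypothetical labels: 0 = normal (107 samples), 1 = pneumonia (21 samples)
labels = np.array([0] * 107 + [1] * 21)
weights = compute_class_weight(class_weight='balanced',
                               classes=np.unique(labels), y=labels)
class_weight = dict(zip(np.unique(labels).astype(int), weights))
# 'balanced' gives each class weight n_samples / (n_classes * class_count),
# so the minority (pneumonia) class receives the larger weight
```

Passing this dict to `fit` makes misclassified pneumonia cases cost more during training, which matters when positives are scarce.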
Saving the Model
in_model.save(f'/content/drive/MyDrive/AI_ML_Projects/Capstone Project/working_data/model_inceptionnet.h5')
Plotting Training & Validation Accuracy
# Plot training vs. validation accuracy as percentages
ival_acc = np.asarray(in_model.history.history['val_accuracy']) * 100
iacc = np.asarray(in_model.history.history['accuracy']) * 100
iacc_df = pd.DataFrame({'val_acc': ival_acc, 'acc': iacc})
iacc_df.plot(figsize=(20, 10), yticks=range(50, 100, 5))
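The `history.history` attribute used above is a plain dict mapping metric names to per-epoch lists, so it can be inspected directly before plotting. A minimal sketch with hypothetical values (three epochs, not the notebook's actual run):

```python
import pandas as pd

# Hypothetical training history, shaped like model.fit(...).history
history = {'accuracy':     [0.70, 0.80, 0.86],
           'val_accuracy': [0.68, 0.75, 0.82]}

acc_df = pd.DataFrame(history) * 100                # fractions -> percentages
best_epoch = int(acc_df['val_accuracy'].idxmax())   # epoch with best val accuracy
```

Checking which epoch had the best validation accuracy this way is useful when deciding how many epochs to keep or where to set an early-stopping patience.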
Plotting Training & Validation Loss
# Plot training vs. validation loss
in_loss = in_model.history.history['loss']
in_val_loss = in_model.history.history['val_loss']
in_loss_df = pd.DataFrame({'val_loss': in_val_loss, 'loss': in_loss})
in_loss_df.plot(figsize=(20, 10))
Model Testing
# Collect ground-truth labels from the first 4 test batches
y = []
test.reset()
for i in tqdm(range(4)):
    _, target = test.__getitem__(i)
    for j in target:
        y.append(j)
100%|██████████| 4/4 [00:01<00:00, 2.32it/s]
# Predict on the test generator and threshold the sigmoid outputs at 0.5
test.reset()
y_predi = in_model.predict(test)
predi = []
for i in y_predi:
    if i[0] >= 0.5:
        predi.append(1)
    else:
        predi.append(0)
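The thresholding loop above can also be written as a single vectorized expression with NumPy; the scores below stand in for `model.predict` output and are made up for illustration:

```python
import numpy as np

# Hypothetical sigmoid outputs, shaped (n, 1) like model.predict(test)
y_predi = np.array([[0.12], [0.85], [0.50], [0.49]])

# Flatten to 1-D, compare against 0.5, and cast booleans to 0/1 labels
predi = (y_predi.ravel() >= 0.5).astype(int).tolist()
# -> [0, 1, 1, 0]
```

Besides being shorter, the vectorized form avoids the per-element Python loop, which matters once the test set grows beyond a few batches.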
Classification report & ROC Curve
print(classification_report(y,predi[:len(y)]))
precision recall f1-score support
0.0 0.96 0.87 0.91 107
1.0 0.55 0.81 0.65 21
accuracy 0.86 128
macro avg 0.75 0.84 0.78 128
weighted avg 0.89 0.86 0.87 128
plt.figure(figsize=(20,10))
fpri,tpri,_=roc_curve(y,y_predi[:len(y)])
area_under_curvei=auc(fpri,tpri)
print('The area under the curve is:',area_under_curvei)
# Plot area under curve
plt.plot(fpri,tpri,'b.-')
plt.xlabel('false positive rate')
plt.ylabel('true positive rate')
plt.plot(fpri,fpri,linestyle='--',color='black')
The area under the curve is: 0.8680462839341344
[<matplotlib.lines.Line2D at 0x7f9d0b5b3050>]
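Beyond plotting, the thresholds that `roc_curve` returns can suggest a better operating point than the default 0.5, which is worth considering here given the low precision (0.55) on the pneumonia class. A sketch on toy labels and scores (not the notebook's data) using Youden's J statistic:

```python
import numpy as np
from sklearn.metrics import roc_curve

# Toy ground truth and predicted scores, for illustration only
y_true = np.array([0, 0, 1, 1])
scores = np.array([0.1, 0.4, 0.35, 0.8])

fpr, tpr, thresholds = roc_curve(y_true, scores)
# Youden's J = TPR - FPR; the threshold maximizing it balances
# sensitivity against false alarms
best_threshold = thresholds[np.argmax(tpr - fpr)]
```

In a screening setting one might instead weight recall more heavily than J does, since a missed pneumonia case is costlier than a false alarm.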